The GDPR deadline is just over the horizon and there can be few, if any, organisations that are not at least aware of the regulation designed to replace the Data Protection Act and, especially, of the ‘life-threatening’ fines that are ‘promised’ to those that fall foul of the new legislation. Whether corporations will be sued because they fail to act swiftly and comprehensively enough when one or more individuals exercise their right to be forgotten (somewhat at odds with governments’ desire to know everything about us, just in case we have terrorist tendencies), or because they have suffered a data breach – even though they notified everyone affected in 12 hours, rather than within the 72 hours allowed – remains to be seen. After the hype will come the reality. And whatever does or doesn’t happen, I’m prepared to bet that all those fly-by-night organisations that phone, email and generally harass us with some kind of marketing scam will continue to do so!
As I’m fairly sure I’ve stated before, so long as you can demonstrate that your company has followed all the right procedures and not cut any corners, you will be extremely unfortunate to find yourself in court and even more unlucky to be fined large sums of money – even a government might blush to take money from ‘offenders’ while one or more of its own departments suffers embarrassing data breaches.
And then there’s the issue of what is an appropriate level of security for any particular organisation. There is no such thing as total security (even the IT experts admit as much), and while money can buy some fairly robust cyber defences, the question remains: what proportion of a company’s turnover or profits should be spent on cybersecurity?
The legal industry will get fat on some early test cases no doubt, and only then will we begin to understand whether the GDPR is, as the hype would have us believe, a real game changer, or possibly not much different from the status quo.
Still, whatever happens, the silver lining is that companies all over the globe have spent much needed time and effort cleaning and qualifying their databases – so maybe there won’t be so many junk emails floating around in the future!
Rather than simply “ripping and replacing” legacy systems in favour of hot new technologies, IT decision makers in EMEA are focused on finding the optimal balance between traditional storage and emerging on- and off-premises solutions.
Their aim is to provide the best possible storage infrastructure to meet their business needs now and prepare for future demands. These findings are amongst the results of a recent survey by the Enterprise Strategy Group (ESG), an integrated IT research, analyst, strategy and validation firm. As the industry enters 2018, ESG sees the emergence of a new trend, which it has dubbed “hybrid-cloud-defined”, that will continue to influence spending decisions throughout the year. Reflecting this, the majority of IT decision makers in EMEA have moved at least one workload back from the cloud to on-premises infrastructure in line with business needs. The ESG report surveyed over 410 EMEA IT professionals responsible for evaluating, purchasing and managing data in medium-sized companies and large enterprises about their storage challenges, purchasing dynamics and drivers for the coming 12-18 months.
The ESG study revealed that IT organisations are primarily focused on understanding the practical benefits of new technologies and how these can best be leveraged to meet their individual needs, rather than making snap decisions to invest in the “next big thing”. The ultimate goal is to identify the optimal mix of technologies that delivers the necessary IT storage capabilities and meets corporate requirements.
Storage Challenges
The most common challenge, according to the ESG survey respondents, is data protection, with 17% selecting it as their primary storage issue; together with hardware costs and rapid data growth, it makes up the top three. These challenges create a perfect storm for IT decision makers: data growth is accelerating, while the infrastructure required to store and protect this data is still all too often perceived as costly and complex.
Storage Spending
Not surprisingly, the vast majority of respondents (94%) reported that their spending on on-premises data storage is either accelerating (49%) or remaining constant (45%). Among respondents with flat or decreasing on-premises storage spending, the cloud was the leading factor: 24% identified using more cloud applications, while 20% cited using more cloud infrastructure services. Innovations in on-premises infrastructure are also having an impact, with 19% of respondents identifying the ability to store data more efficiently as the primary reason their on-premises storage spending growth is levelling off or decelerating.
IT Initiatives
In terms of IT initiatives that will influence storage spending over the next 12-18 months, more than a third (36%) of respondents anticipate leveraging cloud storage to increase capacity without purchasing more on-premises storage, while 26% expect to use more cloud-based applications. While this suggests that some on-premises storage spending will shift to the cloud, many of the remaining responses have positive implications for on-premises storage investments, for example the prevalence of data-intensive workloads such as analytics (36%) and the Internet of Things (31%).
Interestingly, however, while “the cloud” is impacting on-premises storage spending, when storage decision makers were polled about their use of off-premises cloud resources, 57% stated that they had moved at least one workload from a cloud software or infrastructure service back to on-premises resources – indicating a notable element of cloud repatriation.
“Until recently, the storage industry was characterised by continual, steady progress. There were always bigger and/or faster systems, more sophisticated features, and some variations of deployment, but many of the underlying foundations were, for the most part, the same,” said Mark Peters, practice director and senior analyst, Enterprise Strategy Group. “However, ESG’s recent EMEA research clearly reveals that the enterprise storage industry is now in a period of dramatic changes, with IT professionals attempting to find the optimal balance between storage types. We believe we are approaching what we might call a ‘hybrid-cloud-defined’ data ecosystem.”
Study reveals regional disparities in the adoption of cloud security: German businesses are almost twice as likely to secure confidential or sensitive information in the cloud (61%) as British (35%), Brazilian (34%) and Japanese (31%) organisations.
Gemalto can today reveal that while the vast majority of global companies (95%) have adopted cloud services, there is a wide gap in the level of security precautions applied by companies in different markets. Organisations admitted that on average, only two-fifths (40%) of the data stored in the cloud is secured with encryption and key management solutions.
The findings – part of the Gemalto-commissioned Ponemon Institute “2018 Global Cloud Data Security Study” – show that organisations in the UK (35%), Brazil (34%) and Japan (31%) are less cautious than those in Germany (61%) when sharing sensitive and confidential information stored in the cloud with third parties. The study surveyed more than 3,200 IT and IT security practitioners worldwide to gain a better understanding of the key trends in data governance and security practices for cloud-based services.
Germany’s lead in cloud security extends to its application of controls such as encryption and tokenisation. The majority (61%) of German organisations revealed they secure sensitive or confidential information while it is stored in the cloud, ahead of the US (51%) and Japan (50%). The level of security applied increases further still when data is sent and received by the business, rising to 67% for Germany, with Japan (62%) and India (61%) the next highest.
Crucially, however, over three quarters (77%) of organisations across the globe recognise the importance of having the ability to implement cryptographic solutions, such as encryption. This is only set to increase, with nine in 10 (91%) believing this ability will become more important over the next two years – an increase from 86% last year.
Managing privacy and regulation in the cloud
Despite the growing adoption of cloud computing and the benefits that it brings, it seems that global organisations are still wary. Worryingly, half report that payment information (54%) and customer data (49%) are at risk when stored in the cloud. Over half (57%) of global organisations also believe that using the cloud makes them more likely to fall foul of privacy and data protection regulations, slightly down from 62% in 2016.
Due to this perceived risk, almost all (88%) believe that the new General Data Protection Regulation (GDPR) will require changes in cloud governance, with nearly two in five (37%) stating it will require significant changes. As well as difficulty in meeting regulatory requirements, three-quarters of global respondents (75%) also reported that it is more complex to manage privacy and data protection regulations in a cloud environment than in on-premises networks, with France (97%) and the US (87%) finding this the most complex, just ahead of India (83%).
Head in the clouds
Despite the prevalence of cloud usage, the study found that there is a gap in awareness within businesses about the services being used. Only a quarter (25%) of IT and IT security practitioners revealed they are very confident they know all the cloud services their business is using, while around a third (31%) are confident they know.
Looking more closely, shadow IT may be continuing to cause challenges. Over half of Australian (61%), Brazilian (59%) and British (56%) organisations are not confident they know all the cloud computing apps, platform or infrastructure services their organisation is using. Confidence is higher elsewhere, with only around a quarter in Germany (27%), Japan (27%) and France (25%) not confident.
Fortunately, the vast majority (81%) believe that having the ability to use strong authentication methods to access data and applications in the cloud is essential or very important. Businesses in Australia are the keenest to see such authentication put in place, with 92% agreeing it would help ensure only authorised people could access certain data and applications in the cloud, ahead of India (85%) and Japan (84%).
“While it’s good to see some countries like Germany taking the issue of cloud security seriously, there is a worrying attitude emerging elsewhere,” said Jason Hart, CTO, Data Protection at Gemalto. “This may be down to nearly half believing the cloud makes it more difficult to protect data, when the opposite is true.
“The benefit of the cloud is its convenience, scalability and cost control in offering options to businesses that they would not be able to access or afford on their own, particularly when it comes to security. However, while securing data is easier, there should never be an assumption that cloud adoption means information is automatically secure. Just look at the recent Accenture and Uber breaches as examples of data in the cloud that has been left exposed. No matter where data is, the appropriate controls like encryption and tokenisation need to be placed at the source of the data. Once these are in place, any issues of compliance should be resolved.”
Cisco has released the seventh annual Cisco® Global Cloud Index (2016-2021). The updated report focuses on data center virtualization and cloud computing, which have become fundamental elements in transforming how many business and consumer network services are delivered.
According to the study, both consumer and business applications are contributing to the growing dominance of cloud services over the Internet. For consumers, streaming video, social networking, and Internet search are among the most popular cloud applications. For business users, enterprise resource planning (ERP), collaboration, analytics, and other digital enterprise applications represent leading growth areas.
Strong Multicloud Traffic Growth Projected
Driven by surging cloud applications, data center traffic is growing fast. The study forecasts global cloud data center traffic to reach 19.5 zettabytes (ZB) per year by 2021, up from 6.0 ZB per year in 2016 (3.3-fold growth or a 27 percent compound annual growth rate [CAGR] from 2016 to 2021). Globally, cloud data center traffic will represent 95 percent of total data center traffic by 2021, compared to 88 percent in 2016.
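As a quick arithmetic check (not taken from the Cisco report itself), the illustrative Python snippet below derives the fold growth and compound annual growth rate from the two annual-traffic figures quoted above.

```python
# Illustrative check of the growth figures quoted above; inputs are the
# 2016 and 2021 annual cloud data center traffic volumes in zettabytes.
base_zb, forecast_zb, years = 6.0, 19.5, 5

fold_growth = forecast_zb / base_zb                 # 19.5 / 6.0 = 3.25, i.e. ~3.3-fold
cagr = (forecast_zb / base_zb) ** (1 / years) - 1   # compound annual growth rate

print(f"fold growth: {fold_growth:.2f}x")           # 3.25x
print(f"CAGR: {cagr:.1%}")                          # ~26.6%, matching the ~27% quoted
```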
Improved Security and IoT Fuel Cloud Growth
In the past, security concerns have been a major barrier to cloud adoption. Improvements in data center governance and data control have helped to minimize enterprise risk and better protect consumer information. Security innovations coupled with tangible cloud computing benefits, including scalability and economies of scale, play key roles in fueling the cloud growth projected in the study. Additionally, the growth of Internet of Things (IoT) applications such as smart cars, smart cities, connected health and digital utilities requires scalable computing and storage solutions to accommodate new and expanding data center demands. By 2021, Cisco expects IoT connections to reach 13.7 billion, up from 5.8 billion in 2016.
Hyperscale Data Centers Doubling
The increasing need for data center and cloud resources has led to the development of large-scale public cloud data centers called hyperscale data centers. This year’s forecast expects there to be 628 hyperscale data centers globally by 2021, compared to 338 in 2016 (1.9-fold growth, or a near doubling, over the forecast period), and estimates the share of data center servers, processing power, stored data and traffic that these facilities will support by then.
“Data center application growth is clearly exploding in this new multicloud world. This projected growth will require new innovations especially in the areas of public, private and hybrid clouds,” said Kip Compton, Vice President of Cisco's Cloud Platform and Solutions Group.
Global Cloud Index Highlights and Key Projections:
1. Data center virtualization and cloud computing growth
By 2021, 94 percent of workloads and compute instances will be processed by cloud data centers; 6 percent will be processed by traditional data centers.
Overall data center workloads and compute instances will more than double (2.3-fold) from 2016 to 2021; however, cloud workloads and compute instances will nearly triple (2.7-fold) over the same period.
The workload and compute instance density for cloud data centers was 8.8 in 2016 and will grow to 13.2 by 2021. Comparatively, for traditional data centers, workload and compute instance density was 2.4 in 2016 and will grow to 3.8 by 2021.
2. Growth in stored data fueled by big data and IoT
Globally, the data stored in data centers will nearly quintuple to reach 1.3 ZB by 2021, up 4.6-fold (a CAGR of 36 percent) from 286 EB in 2016.
Big data will reach 403 exabytes (EB) by 2021, up almost 8-fold from 51 EB in 2016. Big data will represent 30 percent of data stored in data centers by 2021, up from 18 percent in 2016.
The amount of data stored on devices will be 4.5 times higher than data stored in data centers, at 5.9 ZB by 2021.
Driven largely by IoT, the total amount of data created (and not necessarily stored) by any device will reach 847 ZB per year by 2021, up from 218 ZB per year in 2016. Data created will be two orders of magnitude higher than data stored.
3. Applications contribute to rise of global data center traffic
By 2021, big data will account for 20 percent (2.5 ZB annual, 209 EB monthly) of traffic within data centers, compared to 12 percent (593 EB annual, 49 EB monthly) in 2016.
By 2021, video streaming will account for 10 percent of traffic within data centers, compared to 9 percent in 2016.
By 2021, video will account for 85 percent of traffic from data centers to end users, compared to 78 percent in 2016.
By 2021, search will account for 20 percent of traffic within data centers, compared to 28 percent in 2016.
By 2021, social networking will account for 22 percent of traffic within data centers, compared to 20 percent in 2016.
4. SaaS most popular cloud service model through 2021
By 2021, 75 percent (402 million) of the total cloud workloads and compute instances will be SaaS workloads and compute instances, up from 71 percent (141 million) in 2016. (23% CAGR from 2016 to 2021).
By 2021, 16 percent (85 million) of the total cloud workloads and compute instances will be IaaS workloads and compute instances, down from 21 percent (42 million) in 2016. (15% CAGR from 2016 to 2021).
By 2021, 9 percent (46 million) of the total cloud workloads and compute instances will be PaaS workloads and compute instances, up from 8 percent (16 million) in 2016. (23% CAGR from 2016 to 2021).
For the purposes of the study, cloud computing includes platforms that enable ubiquitous, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. Deployment models include private, public, and hybrid clouds. Cloud data centers can be operated by service providers as well as private enterprises.
Gartner has published the 2018 Gartner Peer Insights Customers' Choice for Data Center Backup and Recovery Software. The Peer Insights Customers' Choice is a recognition of vendors in this market by verified end-user professionals, taking into account both the number of end-user reviews and the overall user ratings.
The Data Center Backup and Recovery Software market is focused on providing backup capabilities for the upper-end midmarket and large-enterprise environments. Protected data comprises data center workloads, such as file share, file system, operating system, database, email, content management, customer relationship management (CRM), enterprise resource planning (ERP) and collaboration application data.
Backup and recovery software products provide features such as traditional backup to tape, backup to conventional random-access media or devices that emulate the previous backup targets, data reduction, array- and/or server-based snapshots, heterogeneous replication and continuous data protection (CDP).
"Gartner Peer Insights Customers' Choice is designed to help IT professionals make better purchase decisions by leveraging the knowledge and feedback from fellow verified end users," said Ken Davis, executive vice president of Products and Services at Gartner. "We work hard to ensure that all of the reviews are submitted by an IT professional or business user who has had experience purchasing, implementing and/or using the enterprise solution they are reviewing — free from conflicts of interest."
The list of companies selected for the 2018 Gartner Peer Insights Customers' Choice is included in Table 1. More information on the 2018 Customers' Choice is available at https://www.gartner.com/reviews/customers-choice/data-center-backup-and-recovery-software.
Table 1
2018 Gartner Peer Insights Customers' Choice for Data Center Backup and Recovery Software
Vendor
Acronis
Actifio
Cohesity
Commvault
SEP
Unitrends
Veeam
Note: Companies are listed in alphabetical order.
Source: Gartner (February 2018)
Managed service provider (MSP) MKBonlinediensten.nl has selected Scale Computing's HC3 cluster. Since deployment, the company has delivered high performance, increased speed and simplified management for its customers across the Netherlands.
MKBonlinediensten.nl is dedicated to providing its customers with online technology services to support business development and growth for local SMBs. The company delivers a range of solutions, including a secure online workplace, that enable customers to use and deploy the latest technology services in the industry. As an MSP, it is critical that the company delivers the latest solutions and meets growing customer demands, so MKBonlinediensten.nl looked at enhancing its online workplace service to allow it to grow to the next level.
After researching the market and following a successful proof of concept the MSP opted for Scale Computing’s HC3 platform. Integrating storage, servers, high availability and virtualisation into one appliance, the HC3 solution is designed to offer simplicity. Since the deployment, MKBonlinediensten.nl has been able to reduce the installation time of new customer requests by over 85% and has benefited from reduced management time and cost savings.
Alongside this, MKBonlinediensten.nl has been able to improve customer satisfaction. Due to the ease of use and reduced management time, the company now offers its customers the ability to build temporary servers, a new platform that is attracting more customers to look at utilising the MSP’s services. MKBonlinediensten.nl has also taken advantage of Scale Computing’s native HEAT feature (HyperCore Enhanced Automated Tiering), which increases speed and performance as and when needed.
Richard Druppers, owner of MKBonlinediensten.nl noted, “With our previous IT environment we needed to patch up the system and reboot our IT which would take time. Scale Computing has completely eliminated this with regular, simple updates. The Scale Computing solution was very easy to handle, manage and deploy. It provided the same benefits as competing platforms but was much simpler and it suited all our requirements. The flexibility Scale Computing provides has also saved us a great deal of time and our customers have been pleasantly surprised at how quickly we can get back to them.”
Johan Pellicaan, VP and MD EMEA at Scale Computing commented: “MKBonlinediensten.nl is dedicated to providing SMBs with the best solutions available and we are pleased to be able to work with the company to enhance its offering. The HC3 solution is designed for organisations looking to streamline and reduce the complexity of virtualisation, while providing the performance, speed and scalability needed to help growing organisations. SMBs have many of the same needs as enterprises, but it is key to have the flexibility required to grow and expand.”
Manhattan Associates, the leading supply chain and omni-channel commerce solutions provider, uses Cohesity to save time and money on data protection and cloud integration.
Manhattan Associates manages approximately one petabyte of data. Before implementing the Cohesity solution, Manhattan’s legacy secondary storage system lacked scalability and made the process of protecting its virtual machines overly complex. Following a proof-of-concept, Manhattan Associates replaced the legacy environment with Cohesity DataPlatform, a web-scale, hyperconverged secondary storage solution.
The deployment included Cohesity File Services management in concert with Cohesity CloudTier, which enabled provisioning of NFS datastores and support for SMB protocols to consolidate data protection and file storage. The company also used Cohesity CloudArchive to migrate inactive, cold virtual machines natively to AWS S3 for long-term archival to Amazon Glacier, eliminating dependence on tapes. The data in the cloud is deduplicated and compressed for storage efficiency. The implementation was done in partnership with SHI, one of America’s top 10 largest IT solutions providers, and technology partners AWS, Pure Storage, and VMware.
Manhattan Associates achieved several important benefits after installing Cohesity’s solutions:
Capital expenses (CapEx) reduction by using commodity hardware that is configured optimally for Cohesity
Operating expenses (OpEx) savings of administrators’ time, training and management costs by consolidating to Cohesity’s all-in-one solution that combines backup software, target storage, scale-out file storage, and cloud integration
Upgrade autonomy, as the IT team can now perform upgrades, updates, and capacity expansions without causing system disruptions
Faster backup windows that reduce administrators’ time spent troubleshooting
Budget optimisation by leveraging a scale-out, pay-as-you-grow model.
“As a global technology solutions provider, we like to stay ahead of the technology curve. We saw clear value in adopting Cohesity to consolidate and simplify our secondary storage infrastructure,” said Brian Sweeney, principal engineer, Manhattan Associates. “Cohesity allows us to consolidate data protection and file storage, while providing simple integration with the public cloud for long-term archival”.
Acronis’ expertise in data protection will ensure ultimate security and availability of all data, resulting in improved performance on the track for Williams Martini Racing Formula One team.
Acronis and Williams Martini Racing Formula One team have formed a new technology partnership. As part of the agreement, Acronis will deliver innovative data protection solutions, including backup, disaster recovery, software-defined storage, and file sync and share.
Formula One is one of the world’s most technologically advanced sports. Every grand prix weekend Formula One teams capture hundreds of gigabytes of telemetry data, and produce terabytes of engineering and test data at the factory. Data analysis fuels innovation and technological development. The ability to interpret the data and make informed decisions is often what sets teams apart, making data the most valuable asset in the race toward the finish line.
Acronis’ expertise in data protection will assist Williams to deal with the growing volumes of data without compromising the security and flexibility mandated by Formula One. Through its partnership with Acronis, Williams will be able to access a full set of innovative data protection solutions.
Acronis also integrates a unique artificial intelligence-based active ransomware protection technology into Acronis Backup. As new ransomware strains increasingly target backup files, Acronis’ solution protects backups while adding another level of defence to the entire system.
Acronis’ products are already used by the top automotive and manufacturing companies worldwide. Racing teams and corporations choose Acronis solutions for the performance and reliability required in high-pressure manufacturing environments.
“Technical innovation is at the heart of everything we do at Williams, and with that comes a crucial need to protect our data,” noted Claire Williams, Deputy Team Principal, Williams Martini Racing. “Acronis will protect Williams’ on-premise and cloud service data with backup, disaster recovery, and secure file sync and share solutions. We are delighted to be partnering with Acronis whose values mirror our own to push technology and innovation. We look forward to them helping to deliver practical solutions throughout the coming season to support our racing efforts.”
“Acronis is at the leading edge of data protection technology and continuing to push. Speed, technology, innovation, and a never-give-up attitude are at the heart of our DNA and this is what unites us with Williams. Acronis’ data protection solutions are perfectly suited for Williams’ data-intensive environment. We’re looking forward to a productive season together,” said John Zanni, President of Acronis.
NexStor's advice and expertise have helped Kings College Cambridge to future proof its data storage capabilities by allowing it to almost triple its storage capacity from 17TB to 45TB, significantly enhance performance and considerably accelerate data recovery.
Andrews said: “We talked to several potential partners and NexStor really stood out. The team was the quickest to understand what we needed, was very easy to work with, and came highly recommended by a neighbouring college. NexStor’s experience and knowledge meant it asked us the right questions, put forward the most suitable solution that fitted our budget, and then quickly implemented it. We were up and running in a week and are very impressed with the results so far.”
As well as scalability, the college also needed its new solution to support its existing VMware software and predicted updates over the next five years. NexStor recommended a Nimble CS300 SAN and its team was able to carry out migration and implementation within a week. Its service also includes aftercare, so Kings College can now easily get support if it runs into any teething problems. The college has seen a notable improvement in the performance of its virtual estate and applications. Troubleshooting capabilities are better too, and the Veeam backup system installed by NexStor has enabled more granular data restoration, alongside enhanced backup reporting.
NexStor director Troy Platts said: “Kings College came to us with a familiar problem – its existing storage solution had been designed more than five years ago. It was not prepared for the enormous growth in data volumes that we’ve all experienced since then, and was certainly not going to cope with the predicted increase in demand for storage space in the coming five years. We made it our priority to find the college a solution that would grow with it and that would offer a little more too. By recommending the Nimble SAN, we were able to address performance issues and provide the peace of mind that comes with more reliable recovery procedures.”
ITV, the UK’s largest commercial television network, has selected two Spectra® BlackPearl® Converged Storage Systems and two Spectra® T950 Tape Libraries to protect and preserve the organisation’s digital assets long-term.
The solution enables ITV to send digital assets to dispersed data centre locations on differing media types, to assure the ultimate safety and security of their content with a genetically diverse data preservation strategy.
“Spectra has met all of ITV’s criteria, providing a simple, affordable and easy-to-use digital archive that is purpose-built to meet our needs,” said Marcel Mester, senior project manager, ITV. “We are confident in the ability of our Spectra solution to preserve our video content for future generations.”
ITV is an integrated producer-broadcaster that creates, owns and distributes high-quality content on multiple platforms. The network produces massive amounts of digital content that needs to be stored for many years. ITV was looking for a solution that was high-capacity, durable and scalable to support their current needs and future growth. ITV also required a system that was non-proprietary, open standard and highly flexible so that several creative departments within ITV could easily access and move content to and from the organisation’s archive.
ITV selected two Spectra BlackPearl Converged Storage Systems and two Spectra T950 Tape Libraries to be utilised at two separate data centre locations: Greenwich and Leeds. The Greenwich data centre deployed a BlackPearl and T950 with IBM® TS1150 drives, and the Leeds data centre deployed a BlackPearl and T950 with LTO-7 tape drives. Data is moved to and from the organisation’s archive automatically via BlackPearl using a variety of integrated partner applications and a Customer Created Client (CCC) that ITV built through Spectra’s BlackPearl Developer Programme. They also move data manually via Spectra’s BlackPearl® Eon Browser.
“As an existing Spectra customer, ITV was familiar with Spectra’s products and services and confident in our ability to deliver exactly what they needed in a digital archive solution,” said Brian Grainger, chief sales officer, Spectra Logic. “The combination of Spectra’s BlackPearl and tape libraries streamlines ITV’s digital workflow, eliminating costly middleware and simplifying the management of their assets. It keeps their content safe and secure, while still allowing creative departments to quickly access data as needed.”
International engineering enterprise implements Commvault for management and future proofing of dispersed data estate including full migration to the cloud, data protection and compliance
Laing O’Rourke, an international engineering enterprise, has implemented the Commvault Data Platform to centralise and take control of its data management in Europe through the scalable and secure solution. Initially implemented to address the backup needs of a dispersed mobile workforce, Commvault’s solution has now empowered Laing O’Rourke to completely rethink its whole data management strategy and enabled additional layers of protection and cloud migration capabilities.
Initially Laing O’Rourke was in search of a solution that enabled it to move away from legacy tape backup, but by switching to Commvault, it gained the ability to accelerate its adoption of public cloud, PaaS and hyper-converged infrastructure. Commvault’s software is cloud native, so migrating to the cloud comes with the reassurance that data is protected in the same way it has been on-premises, which was a key concern when looking to adopt cloud. This capability has also enabled Laing O’Rourke to adopt cloud native applications such as Office 365, which supports its agile business model.
"At Laing O'Rourke we are ramping up our digital transformation activities and have adopted a cloud-first strategy to aggressively leverage public cloud services,” said Gareth Burton, CIO Europe, Laing O’Rourke. “Where we have chosen to stay on-premises, we are consolidating data centres, introducing hyper-converged private cloud and improving data availability and business resilience across the board.”
A leading construction and engineering enterprise, Laing O’Rourke brings innovation and excellence to the sector through its digital and offsite manufacturing approach. Its UK project portfolio boasts London Heathrow’s Terminals 2 and 5 and Crossrail stations, and the company is currently on site delivering the main package for Hinkley Point C.
The company is currently delivering 50 live construction projects across the UK and is headquartered in Dartford, Kent with a number of regional offices and manufacturing hubs throughout the country. Laing O’Rourke’s IT infrastructure consists of three data centres, which are 85% virtualised using VMware’s vSphere and Microsoft’s Hyper-V platforms. It has 350TB of active data, various application data sets including Exchange, databases and web apps, and large unstructured file data.
“The need to move and protect workloads across our hybrid systems, and to introduce data cost management processes meant a data management platform, rather than point products, made sense to manage our large and geographically dispersed data landscape. It was an easy decision to partner with Commvault,” said Burton.
“It was essential that our enterprise cloud strategy included plans for protecting, securing and managing cloud data as well as our on-premises datasets as they are moved, managed and used for business gain. Increasing regulation and cyber security requirements brings another focus to our data management practices. Our data growth is only going in one direction and Commvault has proven itself to be a knowledgeable and valuable partner as we transform to an ever more complex hybrid IT setting."
“Laing O’Rourke represented a unique opportunity to challenge the Commvault Data Platform because of its geographically dispersed workforce and data estate and its desire to be an early adopter of new technology,” said Rob Van Lubek, Area Vice President EMEA North at Commvault. “I am pleased to say that the software has excelled at handling every aspect of Laing O’Rourke’s evolving IT strategy, from enabling cloud adoption to protecting data from any threat. No matter where the data sits, Commvault has not only offered a solution, but also extended Laing O’Rourke’s team with knowledgeable experts and superior support.”
Angel Business Communications have announced the categories and entry criteria for the 2018 Datacentre Solutions Awards (DCS Awards).
The DCS Awards are designed to reward the product designers, manufacturers, suppliers and providers operating in the data centre arena, and are updated each year to reflect this fast-moving industry. The Awards recognise the achievements of the vendors and their business partners alike, and this year encompass a wider range of project, facilities and information technology award categories, as well as Individual and Innovation categories, designed to address all the main areas of the data centre market in Europe.
The DCS Awards categories provide a comprehensive range of options for organisations involved in the IT industry to participate. You are therefore encouraged to submit your nominations as soon as possible for the categories where you think you have achieved something outstanding, or where you have a product that stands out from the rest, to be in with a chance of winning one of the coveted crystal trophies.
This year’s DCS Awards continue to focus on the technologies that are the foundation of a traditional data centre, but we’ve also added a new section which focuses on Innovation with particular reference to some of the new and emerging trends and technologies that are changing the face of the data centre industry – automation, open source, the hybrid world and digitalisation. We hope that at least one of these new categories will be relevant to all companies operating in the data centre space.
The editorial staff at Angel Business Communications will validate entries and announce the final short list to be forwarded for voting by the readership of the Digitalisation World stable of publications during April and May. The winners will be announced at a gala evening on 24th May at London’s Grange St Paul’s Hotel.
The 2018 DCS Awards feature 26 categories across five groups. The Project and Product categories are open to end-user implementations, services, products and solutions that were available (i.e. shipping in Europe) before 31st December 2017. Company nominees must have been present in the EMEA market prior to 1st June 2017, individuals must have been employed in the EMEA region prior to 31st December 2017, and Innovation Award nominees must have been introduced between 1st January and 31st December 2017.
Nomination is free of charge and each entry can include up to two supporting documents to enhance the submission. The deadline for entries is 9th March 2018.
Please visit www.dcsawards.com for rules and entry criteria for each of the following categories:
DCS Project Awards
Data Centre Energy Efficiency Project of the Year
New Design/Build Data Centre Project of the Year
Data Centre Automation and/or Management Project of the Year
Data Centre Consolidation/Upgrade/Refresh Project of the Year
Data Centre Hybrid Infrastructure Project of the Year
DCS Product Awards
Data Centre Power Product of the Year
Data Centre PDU Product of the Year
Data Centre Cooling Product of the Year
Data Centre Facilities Automation and Management Product of the Year
Data Centre Safety, Security & Fire Suppression Product of the Year
Data Centre Physical Connectivity Product of the Year
Data Centre ICT Storage Product of the Year
Data Centre ICT Security Product of the Year
Data Centre ICT Management Product of the Year
Data Centre ICT Networking Product of the Year
DCS Company Awards
Data Centre Hosting/co-location Supplier of the Year
Data Centre Cloud Vendor of the Year
Data Centre Facilities Vendor of the Year
Data Centre ICT Systems Vendor of the Year
Excellence in Data Centre Services Award
DCS Innovation Awards
Data Centre Automation Innovation of the Year
Data Centre IT Digitalisation Innovation of the Year
Hybrid Data Centre Innovation of the Year
Open Source Innovation of the Year
DCS Individual Awards
Data Centre Manager of the Year
Data Centre Engineer of the Year
The next Data Centre Transformation events, organised by Angel Business Communications in association with DataCentre Solutions, the Data Centre Alliance, The University of Leeds and RISE SICS North, take place on 3 July 2018 at the University of Manchester and 5 July 2018 at the University of Surrey.
For the 2018 events, we’re taking our title literally, so the focus is on each of the three strands of our title: DATA, CENTRE and TRANSFORMATION.
The DATA strand will feature two Workshops on Digital Business and Digital Skills together with a Keynote on Security. Digital transformation is the driving force in the business world right now, and the impact that this is having on the IT function and, crucially, the data centre infrastructure of organisations is something that is, perhaps, not as yet fully understood. No doubt this is in part due to the lack of digital skills available in the workplace right now – a problem which, unless addressed, urgently, will only continue to grow. As for security, hardly a day goes by without news headlines focusing on the latest, high profile data breach at some public or private organisation. Digital business offers many benefits, but it also introduces further potential security issues that need to be addressed. The Digital Business, Digital Skills and Security sessions at DTC will discuss the many issues that need to be addressed, and, hopefully, come up with some helpful solutions.
The CENTRE strand features two Workshops on Energy and Hybrid DC with a Keynote on Connectivity. Energy supply and cost remains a major part of the data centre management piece, and this strand will look at the technology innovations that are impacting on the supply and use of energy within the data centre. Fewer and fewer organisations have a pure-play in-house data centre real estate; most now make use of some kind of colo and/or managed services offerings. Further, the idea of one or a handful of centralised data centres is now being challenged by the emergence of edge computing. So, in-house and third party data centre facilities, combined with a mixture of centralised, regional and very local sites, makes for a very new and challenging data centre landscape. As for connectivity – feeds and speeds remain critical for many business applications, and it’s good to know what’s around the corner in this fast moving world of networks, telecoms and the like.
The TRANSFORMATION strand features Workshops on Automation and The Connected World together with a Keynote on Automation (Ai/IoT). IoT, AI, ML, RPA – automation in all its various guises is becoming an increasingly important part of the digital business world. In terms of the data centre, the challenges are twofold. How can these automation technologies best be used to improve the design, day to day running, overall management and maintenance of data centre facilities? And how will data centres need to evolve to cope with the increasingly large volumes of applications, data and new-style IT equipment that provide the foundations for this real-time, automated world? Flexibility, agility, security, reliability, resilience, speeds and feeds – they’ve never been so important!
Delegates select two 70-minute workshops to attend and take part in an interactive discussion led by an Industry Chair and featuring panellists – specialists and protagonists – in the subject. The workshops will ensure that delegates not only earn valuable CPD accreditation points but also have an open forum to speak with their peers, academics and leading vendors and suppliers.
There is also a Technical track where our Sponsors will present 15 minute technical sessions on a range of subjects. Keynote presentations in each of the themes together with plenty of networking time to catch up with old friends and make new contacts make this a must-do day in the DC event calendar. Visit the website for more information on this dynamic academic and industry collaborative information exchange.
This expanded and innovative conference programme recognises that data centres do not exist in splendid isolation, but are the foundation of today’s dynamic, digital world. Agility, mobility, scalability, reliability and accessibility are the key drivers for the enterprise as it seeks to ensure the ultimate customer experience. Data centres have a vital role to play in ensuring that the applications and support organisations can connect to their customers seamlessly – wherever and whenever they are being accessed. And that’s why our 2018 Data Centre Transformation events, Manchester and Surrey, will focus on the constantly changing demands being made on the data centre in this new, digital age, concentrating on how the data centre is evolving to meet these challenges.
The European Managed Services & Hosting Summit 2018 is a management-level event designed to help channel organisations identify opportunities arising from the increasing demand for managed and hosted services and to develop and strengthen partnerships.
Previous articles here have reflected on the changes that the managed services model brings customers – their ability to change their buying model to a revenue-based one, often to do more with fewer resources, and then adopt new working tools such as analytics, which just weren’t available at the right price before. The impact on the IT industry supplying those customers has been profound as well, requiring a real re-think of sales processes, built around a continuous relationship with the customer, not just a “sell-and-forget” on big-ticket items.
Obviously, the IT channel is attracted by the prospect of more sales by working in managed services, with the world market predicted to grow at a compound annual growth rate of 12.5% to 2019. But it is such a fundamental change to their structures that some consider it a step too far, even under pressure from customers for the benefits that managed services can bring them. Those partners may find themselves rapidly left behind as the new model becomes the standard in most industries.
This, coupled with the ease of entry into the market for cloud-based solutions suppliers, means that the IT channel is having to face a whole new competitive threat. A business “born in the cloud” has an obvious advantage when trying to sell cloud services to a customer, compared with a traditional reseller – the cloud-based channel “eats its own dog-food”, to adopt a rather unwholesome phrase imported across the Atlantic.
So, in establishing the agenda for the European Managed Services and Hosting Summit in Amsterdam in May this year, the organisers are thinking beyond the obvious GDPR issues which will inevitably be in the headlines as its deadline comes round, and even the ever-popular M&A discussions of company value, to bring out a flavour of the sales-engagement process in managed services. We are asking our leading speakers to examine the business processes of the best managed services companies, to try to identify what makes them tick - and tick ever faster and with wider portfolios.
How is the sales process managed? How are the salespeople rewarded in the revenue model? How do they maintain that ongoing relationship with the customer in a cost-effective way? How do they ensure that the salesforce is motivated and retained in the longer term, while keeping them up to date with the latest information on the market, the technologies and customer issues?
None of this is easy, and many managed service providers, integrators, traditional resellers and even those new and fast-growing “born-in-the cloud” supplier companies still have many questions to put to the experts, and the MSHS Europe is the perfect event at which to do this, with many leading suppliers on hand as well as industry experts.
The MSHS event offers multiple ways to get those answers: from plenary-style presentations from experts in the field to demonstrations; from more detailed technical pitches to wide-ranging round-table discussions with questions from the floor. There is no excuse not to come away from this with questions answered, or at least a more refined view on which questions actually matter.
One of the most valuable parts of the day, previous attendees have said, is the ability to discuss issues with others in similar situations, and we are all hoping to learn from direct experience, especially in the complex world of sales and sales management.
In summary, the European Managed Services & Hosting Summit 2018 is a management-level event designed to help channel organisations identify opportunities arising from the increasing demand for managed and hosted services and to develop and strengthen partnerships. More details: http://www.mshsummit.com/amsterdam/
Data protection of digital data is a fundamental and mandatory responsibility for all organizations. Therefore, organizations need to understand the basic principles and concepts of data protection, especially in our current era of massive data breaches. To satisfy that need, the Storage Networking Industry Association (SNIA) has developed a technical whitepaper to provide the industry with a vendor-neutral overview of the relevant best current practices for data protection at the storage level.
By Thomas Rivera, SNIA Data Protection and Capacity Optimization (DPCO) committee chair.
Data protection is traditionally viewed as the execution of backup operations that are assured of providing data recovery if a loss of the original data (production data) occurs. In fact, data protection encompasses much more than backup and recovery techniques: it also covers issues related to data corruption and data loss, data accessibility and availability, and compliance with retention and privacy rules and regulations.
There are many factors to consider when it comes to data protection at the storage level. The main areas fall into three data protection “drivers”. These are data corruption and data loss, accessibility and availability, and compliance. Protected data must meet intended uses for all three drivers. Preventing data corruption and data loss ensures that the data is what the organization expects it to be when the data needs to be used. Accessibility and availability relate to the data being made available in a timely manner for intended uses. Compliance ensures that the data usage meets all legal and regulatory requirements.
Data Corruption and Data Loss
Data must be protected both logically (to prevent data corruption from hacking or other external threats) and physically (in the case of data loss or the irreversible failure of a storage device). Physical prevention of data loss from hardware failure on a random-access storage system can use techniques such as RAID or erasure coding.
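To make the hardware-failure case concrete, here is a minimal, purely illustrative Python sketch of single-parity protection in the style of RAID 5: one XOR parity block allows the data on any one failed device to be rebuilt from the surviving blocks. Real RAID and erasure-coding implementations are considerably more sophisticated.

```python
from functools import reduce

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks together."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

# Three data blocks striped across three devices, plus one parity block on a fourth.
data = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_blocks(data)

# Simulate the irreversible failure of the device holding the second block.
surviving = [data[0], data[2], parity]
recovered = xor_blocks(surviving)

assert recovered == data[1]           # the lost block is rebuilt from the survivors
print("recovered block:", recovered)  # b'BBBB'
```

Erasure coding generalises the same idea, computing several parity fragments so that multiple simultaneous device failures can be tolerated.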
Backup and recovery are two of the traditional cornerstones of data protection for both physical and logical reasons. Backup relates to the processes of providing a copy of the data at a point in time, and recovery refers to the ability to restore data for intended application use according to the organizational SLAs. One approach on a storage system itself is through the use of snapshots. These snapshots may serve as the basis for the data that is copied to a backup target storage system. Other approaches include the use of continuous data protection, or the use of a public or private cloud as a backup service.
Cloud backup refers to backing up data to a remote, storage-as-a-service (public, private or hybrid). A cloud backup service is not a pre-defined, fixed solution and must be considered in the overall context of a business data protection or disaster recovery strategy. Cloud-based backup appeals to many businesses because it offers a low-cost way to protect business data off-site but there are many different considerations to be aware of when planning such an implementation.
Moving the business data into the cloud is the easy part. Getting it back when you really need it is when things can get challenging. For this reason, it is important to understand all the imperatives before embracing public cloud-based data backup as part of the data protection and disaster recovery strategy.
When considering backup to a cloud provider, it is imperative to define the business requirements. These may include business demands, SLAs and Quality of Service (QoS) levels for the backup data, along with the skills required to deploy the cloud-based backup technology. It is also important to pay careful attention to the network design that will connect the cloud service provider to your data center: what network currently exists, what security does it offer, is there enough bandwidth and redundancy, and what is the latency?
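One of those bandwidth questions comes down to simple arithmetic. The hypothetical Python sketch below (the figures are placeholders, not recommendations) estimates whether a nightly incremental backup to a cloud provider fits within the available window at a given uplink speed.

```python
# Hypothetical figures -- substitute your own change rate, link speed and window.
protected_tb = 50            # total protected data, terabytes
daily_change_rate = 0.03     # 3% of the data changes per day (incremental backup)
dedupe_ratio = 2.0           # effective reduction from deduplication/compression
uplink_mbps = 500            # usable WAN bandwidth, megabits per second
window_hours = 8             # allowed nightly backup window

to_send_tb = protected_tb * daily_change_rate / dedupe_ratio
to_send_megabits = to_send_tb * 1e6 * 8            # TB -> MB -> megabits (decimal units)
hours_needed = to_send_megabits / uplink_mbps / 3600

print(f"data to send: {to_send_tb:.2f} TB, time needed: {hours_needed:.1f} h")
print("fits in window" if hours_needed <= window_hours else "does NOT fit in window")
```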
Replication and mirroring are also used to make copies of data. Replication refers to point-in-time copies, whereas mirroring provides for continuous writing of data to two or more targets. Replication may be used for both physical and logical data protection, while mirroring is a physical data protection approach.
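As a toy illustration of that distinction (hypothetical Python, not any particular product's behaviour), a mirrored write lands on both targets as it happens, whereas a replica is only brought up to date at scheduled points in time.

```python
import time

primary, mirror, replica = {}, {}, {}

def mirrored_write(key, value):
    """Mirroring: every write lands on both targets before it is acknowledged."""
    primary[key] = value
    mirror[key] = value            # continuous, write-by-write copy

def replicate():
    """Replication: a point-in-time copy of the primary, taken on a schedule."""
    replica.clear()
    replica.update(primary)
    replica["_copied_at"] = time.time()

mirrored_write("order-1", "paid")
replicate()                        # replica now reflects this point in time
mirrored_write("order-2", "new")   # mirror has it immediately; replica does not until the next cycle
print(mirror, replica, sep="\n")
```

The gap between replication cycles is what determines how much recent data a replica can be missing, which is exactly the recovery point question discussed later in this piece.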
An archive is an official set of more or less fixed data that is managed separately from more active production data. As such, copies have to be made for data protection purposes, but more active measures, such as standard backup or mirroring, are not necessary.
Accessibility and Availability
For accessibility and availability, Business Continuity Management (BCM) includes the processes and procedures for ensuring ongoing business operations. One key aspect of BCM is Disaster Recovery (DR), which involves the coordinated process of restoring systems, data, and the infrastructure required to support ongoing business operations after a disaster occurs. But a BCM plan also includes technology, people, and business processes for recovery.
As part of accessibility and availability, basic infrastructure redundancies need to be provided, including UPS systems to provide redundancy for power in case of a power outage and extra network and power connections.
Compliance
Compliance includes the application of specific technologies that secure data to meet the appropriate rules and regulations, typically related to data retention, authenticity, immutability, confidentiality, accountability and traceability, as well as the more general problem of data breaches. A number of storage technologies support these compliance requirements.
The Two Sides of Data Protection
Data protection is an important component of any Information Technology (IT) system, and the methods used for data protection and how they are configured have important inter-relationships with other aspects of the data center. By its nature, data protection has two sides, the backup or replication side and the restore or recovery side.
The backup side of data protection is the process or processes performed on a regular, or even a continuing, basis to create one or more copies of an organization’s primary data at a particular point in time. Backup processes may well differ from one type or subset of data to another, and they must be chosen with care to minimize the impact on the availability of primary data to all applications and users that need it. The backup must also provide for recovery of data in the way prescribed by the organization’s Service Level Agreements (SLAs) with regard to each set of data. Thus traditional daily backups (copies of data to different media) may be used for some subsets of data, while a real-time mirroring process may need to be used for other, highly critical data sets, in order to facilitate faster restores of the data.
Successful recovery operations are the result of having put appropriate backup processes in place, and recovery of lost or corrupted data is vital to an organization’s health. A recovery operation may be required just to replace a file that a user accidentally deleted or a corrupted set of data (operational recovery), or to replace a major portion of a data center or an entire data center, in case of a disaster such as a multiple device failure, a virus or denial of service attack, or the destruction of a data center by a fire or flood (disaster recovery). There are two important considerations or objectives for data recovery that in turn determine how it needs to be backed up; they are the Recovery Point Objective (RPO) and the Recovery Time Objective (RTO). RTO and RPO are important factors in deciding what backup or replication strategy the business needs to use, and they need to be a part of any organization’s SLAs with regard to data protection.
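As a worked illustration of how those two objectives drive the choice of backup or replication strategy, the hypothetical Python sketch below maps a data set's RPO (how much recent data, measured in time, the business can afford to lose) and RTO (how long it can afford to wait for restoration) to a broad class of protection scheme. The thresholds and workloads are placeholders, not SNIA guidance.

```python
def choose_protection(rpo_minutes, rto_minutes):
    """Map recovery objectives to a broad class of protection scheme.
    Thresholds are illustrative placeholders, not recommendations."""
    if rpo_minutes < 5 and rto_minutes < 15:
        return "synchronous mirroring with automated failover"
    if rpo_minutes <= 60:
        return "frequent snapshots replicated to a second site"
    if rpo_minutes <= 24 * 60:
        return "nightly backup to disk or cloud, restore on demand"
    return "periodic archive copies"

# RPO: how much data, measured in time, the business can afford to lose.
# RTO: how long the business can afford to wait for the data to be restored.
workloads = [
    ("trading database", 1, 10),              # RPO and RTO in minutes
    ("departmental file shares", 1440, 480),
    ("completed project archive", 10080, 2880),
]
for name, rpo, rto in workloads:
    print(f"{name}: {choose_protection(rpo, rto)}")
```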
Data Protection and Digital Archives
Although a digital archive represents another set of copies of primary data like those intended for backup or disaster recovery (DR), an archive is more immutable in nature, with changes either not allowed, or strictly controlled by a journaling process. Also, archives themselves require data protection; they are not intended to be used for data protection.
Archives may be divided into two types based on their intended longevity, with those intended to last more than ten years being considered a long-term archive. Long-term archives typically require different methods for storage, security and management.
About the DPCO
For more information about the relevant best current practices for Data Protection, please feel free to download the complete technical white paper here: https://www.snia.org/education/whitepapers. This technical paper includes the many factors to consider when it comes to data protection at the storage level. For more information about the work of the DPCO, visit: http://www.snia.org/dpco.
Our new digital existence – driven by mobile phones, the internet, sensor technology and other intelligent devices – brings many benefits, but also significant concerns. The rapid growth in data created, captured and analysed can help in planning our future world, encourages innovation, enables the development of better services and provides greater convenience.
By Praveen Kumar, general manager, Asia Pacific, ASG Technologies.
However, one question remains uppermost in consumers’ minds—is my information safe? The challenge faced by any organisation holding our data is how to navigate and manage these unprecedented volumes – which are multiplying at historic rates – while still protecting the privacy and security of every customer.
Yet, every week brings news of another security breach. Inevitably, criminals and cyber-terrorists have been quick to recognise the opportunities presented by the ocean of data available to them. As a result, many of the world’s regulatory authorities have responded by creating rules that formalise the steps enterprises must take to protect both customer and enterprise data.
A key upcoming piece of legislation will force enterprises to develop new approaches to information management – the European Union’s General Data Protection Regulation (GDPR). Slated for mandatory compliance by May 25, 2018, GDPR places significant requirements across all organisations collecting data on European residents to closely manage and track the personal information they collect. The rules affect every entity both inside and outside of Europe that holds or uses personal data of covered individuals.
Every business will need to prove it handles personal data properly. Among other requirements, it will be necessary for companies to show consent to use data collected when required, delete data or correct errors and provide copies of data when asked. To fulfill these requirements, it will be vital to track all uses of personal data and protect the privacy of the individual.
To help achieve this, every company housing personal data collected on European residents will benefit from using an enterprise data lineage solution. These solutions can provide quick lineage reports of the source and use of data through the organisation and provide on-the-spot auditing of all data flagged as personal. Without a data lineage solution, or something like it, your company may find itself halting business to provide manual reports to regulatory bodies.
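To make the idea of a lineage report more concrete, here is a toy sketch (not ASG’s product; every system and field name is invented) that records the hops a personal data element makes between systems, then reports where it has flowed and which systems therefore hold personal data.

```python
# Toy data-lineage sketch: record each hop a data element makes between systems,
# then report where a personal field has flowed. All names are invented; real
# lineage tools derive this information automatically from scanners and ETL jobs.
from collections import defaultdict

# (source_system, target_system, field, is_personal)
HOPS = [
    ("crm",          "marketing_dw",  "email_address", True),
    ("marketing_dw", "campaign_tool", "email_address", True),
    ("crm",          "billing",       "invoice_total", False),
]

def lineage(field):
    """Return the ordered chain of systems a field passes through (assumes no cycles)."""
    graph = defaultdict(list)
    for src, dst, f, _ in HOPS:
        if f == field:
            graph[src].append(dst)
    targets = {dst for dsts in graph.values() for dst in dsts}
    path, frontier = [], [s for s in graph if s not in targets]  # start at the origins
    while frontier:
        node = frontier.pop(0)
        path.append(node)
        frontier.extend(graph.get(node, []))
    return path

if __name__ == "__main__":
    print("email_address flows:", " -> ".join(lineage("email_address")))
    holders = {s for src, dst, _, personal in HOPS if personal for s in (src, dst)}
    print("systems holding personal data:", sorted(holders))
```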
Businesses in Singapore will likely be the most affected in Southeast Asia, since the country is the EU's largest commercial partner in ASEAN, accounting for slightly under one-third of EU-ASEAN trade in goods and services.
Complying with the GDPR and similar regulatory requirements such as Singapore’s PDPA is a significant challenge, not least because enterprises have typically locked up vital information in departmental silos, spread across legacy and modern systems ranging from 40-year old mainframes to on-premises storage and the cloud.
A Forrester survey commissioned by ASG Technologies found that one of the key challenges identified by the enterprise architecture and operations professionals surveyed is dealing with their firms’ legacy storage or disconnected content management systems. Twenty-five percent said their ability to move content to the cloud is hampered by their existing infrastructure. Typically, enterprises are adding to their technical base or the technologies they support, rather than replacing them.
Clearly, businesses need to identify and deploy solutions that span traditional and new technologies, enabling them to seamlessly access their data, track its lineage across data warehouses and through transformations while maintaining the necessary information to support governance of personal data in order to demonstrate GDPR compliance.
The costs of understanding and utilising the mass volumes of data in this complex environment are significant, but the price of using inaccurate data for decision making, failing a compliance audit or experiencing a security breach is far higher, not only in terms of cost and lost opportunity but also because of the impact on enterprise reputation.
The bonus for enterprises that address their compliance issues by deploying a dedicated, tool-agnostic data management solution is the deep view it provides into the enterprise’s most valuable data. Accurate representations of the data estate will support making critical decisions faster, providing business agility that will drive immediate results and helping to build new offerings for customers.
This is not an issue that will pass with the GDPR deadline. A new digital world beckons, but businesses can only enjoy the benefits if they treat their data with respect, understand how it was collected, how it is used and are confident of its integrity. When this is the case they will be able to address any further compliance, while optimising the value of their data to remain competitive in this changing world.
For many organisations, the new European General Data Protection Regulation requires the appointment of a Data Protection Officer. While filling this important post can be extremely difficult, a rare breed of data management-savvy cloud providers has what it takes to fill the gap.
By Sophie Chase-Borthwick, Global Lead – GDPR Services, Calligo.
The advent of the European General Data Protection Regulation (GDPR) on May 25 this year poses a very important question for all organisations – do they need to appoint a Data Protection Officer?
Meeting the obligations of any new data regulation has always been onerous, but GDPR is about to raise the stakes substantially. Covering all the personal data of EU citizens anywhere, it requires many organisations to appoint a Data Protection Officer (DPO). But this is a complex and entirely new role and many organisations could still be working out whether they need one, what the job involves and who can do it, until it is too late.
Who needs one of these new officers? Most organisations do
Before the technicalities of the role are even considered, many businesses will be poring over the new regulation to discern whether the DPO requirement covers them. Article 37 of the GDPR specifies that DPOs must be appointed by the following: all public authorities, with the exception of courts; any organisation carrying out systematic monitoring of individuals on a large scale; organisations where the core activities involve the processing of data relating to criminal convictions and offences or so-called “special categories” such as genetic data, health data, racial origin or sexual orientation.
Although this will excuse many organisations, even where the GDPR does not specifically require the appointment of a DPO, the ICO and other enforcement bodies regard the creation of the post as a matter of good practice. In addition, any organisation deciding it does not need a DPO should consider how long it will stay on the right side of the regulation. After all, even if an organisation does not need a DPO, it must still fulfil the same responsibilities – meaning a decision not to appoint a DPO can actually make fulfilling GDPR obligations harder.
Who is qualified to oversee this huge range of responsibilities?
A DPO’s responsibilities are enormously wide-ranging. In short, it involves supervising all data within a business that is subject to GDPR rules. But this simple definition hides the mammoth scope of the task. It will include monitoring the collection of data, justifying its possession, assuring secure storage, auditing vulnerabilities and in many cases overseeing deletion of valuable material.
This breadth begs another question – who is qualified to be a DPO? The requirements can make the DPO’s job description sound like that of some kind of data protection superhero, capable of translating legal requirements into both processes and technical needs, and of overseeing awareness-raising and staff training, all while empowering, rather than restricting, the company’s wider vision.
And in case this wasn’t enough, it stands to reason – and is in fact specifically mentioned in the GDPR guidelines – that the more complex or high risk the data processing activities are, the greater the expertise of the DPO will need to be.
Information security personnel should not be appointed
This is a demanding set of responsibilities, and often the confusion between privacy and security means they are handed to those responsible for security, which is the wrong approach. Anyone with an information security remit is charged with protecting the company and its data, whereas the responsibility of the DPO is to protect the interests of the data subject, even if these appear to clash with those of the company.
For a DPO there should be no conflicts of interest with any other activities in the organisation and if a breach occurs, a report must go to the authorities – it cannot be a matter for debate.
Choosing the right alternative
When the role is reviewed, it is a miracle anyone wants to be a DPO. This hardly makes recruitment easy, especially as the GDPR deadline approaches and qualified personnel are in short supply.
The complexities, the demands of the job, the skills shortage and the cost of appointment – all these factors will inevitably lead to a more pragmatic approach where organisations rely on external expertise: a “DPO-as-a-service” concept.
There is a range of possible options for such input, ranging from lawyers to management consultants. But despite what many of these service providers may claim, meeting ongoing requirements is not solved with an audit and a list of recommendations. That delivers only a quick fix, not ongoing observance, which requires a far more rigorous understanding of the way data comes into and moves through a business, including, but certainly not limited to, the technology involved.
Indeed, where an organisation is advanced enough to have already moved to the cloud, the additional dynamic to data usage that the cloud brings would logically make cloud providers the more suitable partner – provided they are not only cloud experts, but also genuine specialists in data management.
Many cloud providers purport to accommodate data management and privacy in their services, but few in fact have the history, knowledge or expertise to back it up. For many, data management is considered almost an add-on, in much the same way as an additional service such as back-up or disaster recovery.
In fact, the only cloud providers that are suitably qualified to offer data management – and DPO-as-a-Service – are those that have built their original services around data management principles rather than the cloud basics of flexibility, uptime and scalability.
This is a specially-qualified group that is capable of simultaneously advising on and implementing cloud strategy while leaning on long-standing experience in data management and the surrounding legislative frameworks, including GDPR. It is only this rare class of provider that can offer immediate access to consultancy along with the sophisticated tools that assist in fulfilling obligations on a day-to-day basis.
Businesses are rightly concerned about the whole question of DPO functions and roles, and are naturally seeking external support before GDPR comes into force in May. It is only natural that where a business has its data in the cloud, advice on data management is sought from its cloud provider. But that should only be taking place if that provider genuinely has the track record, focus and expertise that qualify it to offer such consultancy or services.
The digital transformation wave is causing widespread change across businesses in all industries, with the creation of new business models, revenue streams and innovative services.
By Jason Bobb, Senior Vice President of Global Sales and Business Development, Canonical.
Leveraging the capabilities of modern technology is now a question of survival. The world is full of examples of businesses that were slow to adapt, fell behind the curve of innovation and saw their growth stall. Equally, there are those that have risen to the top after finding a way to exploit the potential of ‘digital’, the most popular examples being the likes of Netflix and Airbnb.
Rising consumer expectations, combined with increased competition and rapid technological advancements, have led to an environment where if you think you’re standing still, you’re probably actually falling behind.
This means organisations must constantly be looking at how they can exploit new ways of working and deliver new services for customers through the widespread move to cloud-based architectures.
To achieve these goals, business leaders have come to realise that the IT department has a vital role to play. And, for forward-thinking companies, this role is no longer consigned to just supportive, back-office functions. The position of IT has been elevated significantly in recent years, now enjoying prominence and strategic recognition across the entire business.
From cost centre to business leader
There was a time when the IT department was largely viewed as a cost centre, a necessary function that performed a valuable role but wasn’t regarded as being central to the business in terms of revenue growth or strategic direction.
Traditionally, it was the department people called upon to solve their tech-related questions and deal with administrative issues.
Fast forward to 2017 and things have changed significantly. IT teams are now crucial to any business’s future ambitions and have a vital role to play in not only enabling innovation, but shaping strategic direction. Indeed, a massive 97% of respondents to a digital transformation survey carried out by Brocade acknowledged that IT departments are important to enabling the organisation to grow and innovate.
The department is now widely represented at boardroom level and IT teams are often being tasked to lead and inform business strategy. They understand the strategic value of IT related to the industry they are in and are working side-by-side with management to meet business demands.
The simple reason for this is that technology has changed the way businesses work. Most employees now use multiple devices daily, internal business processes are largely controlled through IT applications and customers expect to be able to communicate with organisations through several digital channels, simultaneously.
The common thread through all of this is IT. From network expansion and data management to cyber security and application development, modern IT departments have a hand in virtually all the essential cogs that keep organisations up and running.
Growing in influence
In today’s new world of business, IT is well and truly leading the charge and recent research from Spiceworks has highlighted the increased level of influence IT professionals are enjoying across organisations.
The study revealed that IT Decision Makers (ITDMs) have more influence on major infrastructure purchases than business decision-makers, with 80% evaluating and recommending technology solutions compared to 40% of business decision makers (BDMs). They are also a more trusted source for strategic insight. When it comes to purchasing decisions, BDMs in EMEA rely more on insights from ITDMs (66%) than from their business peers (44%).
Ultimately, IT decision makers have more purchase influence than business decision makers for nearly all technologies, a clear indicator of their growing presence and the vital role they now play in shaping strategic direction.
Infrastructure and innovation
Given that businesses must now be digital to thrive and survive in today’s landscape, the importance of having access to a robust IT infrastructure cannot be overstated.
Organisations are relying on their IT departments to be able to develop – or partner appropriately to deliver – an infrastructure that enables them to leverage the capabilities of technologies such as cloud computing (be that public or private), virtualisation and the Internet of Things, all of which have quickly become essential for business success.
The growth of ‘Big Software’ provides a perfect example of the demands being placed on modern infrastructure. Businesses today are relying on the likes of machine learning, big data and OpenStack architectures to stay ahead of the competition, and the level of configuration required is unprecedented. Combined with the need to innovate rapidly and at scale, efficiently managing an infrastructure that enables this is essential.
The amalgamation of these trends has prompted a shift in focus for IT teams. The emphasis is now less geared towards just keeping the lights on and more towards building infrastructure that enables innovation. Whereas in the past IT departments would spend their time thinking about how IT works, they are now more concerned with how IT can support the business to solve specific business problems and meet customer requirements.
Essentially, the distinction between business and IT is becoming ever-more irrelevant. IT is now the business, in the same way that the business is now IT.
Appointing a Data Protection Officer to comply with the GDPR – should you consider an outsourced service as an alternative to an internal appointment?
By Tim Wright, Partner and Steven Farmer, Counsel in Pillsbury’s Global Sourcing & Technology Transactions group.
It’s hard to avoid headlines relating to or talk of the GDPR these days, with its effective date, 25 May 2018, approaching rapidly. The GDPR, or the EU General Data Protection Regulation (Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC), to give it its full name, will bring in a new approach to data protection regulation across the EU with a marked focus on self-regulation and internal accountability. In a change from the current data protection regime, the GDPR will impose specific legal obligations on data processors for the first time, as well as increased obligations for data controllers.
Who must appoint a DPO?
In almost all cases, public authorities will have to appoint a DPO. In addition, controllers and processors must appoint a DPO where their core activities involve, on a large scale, regular and systematic monitoring of individuals or processing of sensitive personal data, or data relating to criminal convictions and offences.
Organisations will need to consider these concepts (i.e. “core activities”, “regular and systematic monitoring”, “large-scale”) in the context of their own processing operations (current and planned for the future). Although the terms are not defined in the GDPR, they are explained in more detail, with examples of what counts as large-scale processing, in Guidelines published by the Article 29 Working Party, the independent EU data advisory body.
The Guidelines state that unless it is obvious that an organisation doesn’t need to appoint a DPO, it should document the internal analysis carried out when deciding whether or not to appoint one (part of the documentation required by the GDPR’s accountability principle). Since failing to appoint a DPO when required could result in potentially very large fines, having that documentation available if requested by the supervisory authority is important. The analysis should be repeated, and the documentation updated, if the organisation introduces a new service or makes a change to the processing being carried out.
There are no mandatory qualifications to be a DPO, although various accreditations can be sought, such as Certified Data Protection Officer Certification. Each DPO will need appropriate professional qualities and expert knowledge of data protection law, although the level of expertise will depend on the complexity of the business’ processing activities. The DPO will need to understand the actual processing activities carried out, related information technology and data security issues, and be able to promote a data protection culture. The decision-making that led to the appointment (and any subsequent changes) should be documented (as part of the wider accountability obligations). Businesses that are not required by the GDPR to appoint a DPO may nevertheless voluntarily appoint a DPO, although this means that they must comply with the full range of DPO-related compliance obligations.
The role of the DPO
The DPO’s main tasks and activities (which are described in Article 39 and should be carried out with “due regard to the risk associated with processing operations, taking into account the nature, scope, context and purposes of processing”) include providing information and advice to the organisation and its employees about their obligations under GDPR, and monitoring for GDPR compliance. Other activities include managing internal data protection activities, carrying out awareness-raising and staff training, assisting with data protection impact assessments and conducting internal audits. The DPO will be the first point of contact for supervisory authorities and individuals (customers, employees etc.) whose data is processed.
Duties of the employer
The role of the DPO is not an operational one, but instead involves monitoring for compliance and providing advice in an autonomous and independent manner. The business must not instruct the DPO on how to perform his or her role, and the DPO must be able to operate above any conflicts of interest that occur within the business, with internal rules and safeguards to facilitate this.
External appointment
The DPO can be either a staff member or the role can be outsourced to a contractor. Documenting the decision-making process and criteria applied in the selection of the DPO is also a GDPR requirement. In considering the different offerings in the market, as well as determining whether to outsource or not, a number of factors should be considered, such as the size, and nature, of the organisation; the existence of internal competences (including the ability (or otherwise) to ring-fence the DPO away from any conflicts that may arise); the categories of personal data processed; the complexity of the processing, digital transformation and automation plans etc. As with any outsourcing, it is important to allocate sufficient time to assess the market and to conduct adequate due diligence on the shortlisted providers.
Any contractor DPO must have a service contract. The contract should address independence and conflicts of interest, and contain a clear description of the tasks and responsibilities to be performed, including the nomination of a lead individual where the tasks are to be performed by a team. The contract should also state how access to relevant expertise will be guaranteed as well as pricing, service levels, reporting and exit support.
Final Remarks
Whether the decision is made to appoint a staff member or a contractor, the position of DPO in any organisation will be an important appointment to which adequate time and diligence should be devoted. On an ongoing basis, as required by Article 38 of the GDPR, the DPO (or a member of her team) should be “involved, properly and in a timely manner, in all issues which relate to the protection of personal data.”
It’s often said that there are two types of forecasts – lucky or wrong. Predictions remain that supercomputing systems will not reach exascale level (i.e. systems capable of an exaFLOP, a billion billion calculations per second) for another five years or so. But this is not the case when you are talking about the readiness of storage systems that can support exascale.
By James Coomer, Vice President for Product Management, File Systems and Benchmarking at DDN Storage.
The kind of storage architectures that will support these environments are already here; and being utilised in the high-end cloud and supercomputing world. Certainly, from a storage architecture point of view we are well beyond supporting petascale (current generation) systems.
Exascale is just for national labs…
…well no. Firstly we need to define exascale. Literally, it refers to floating point calculation rate, but more broadly it refers to the whole environment that supports such a large compute system, from the applications running atop the compute, through to the storage that manages the data flow to and from the compute. The application of exascale is certainly not just for labs. Just like a space program, the benefits of research and investment into these massive scale national supercomputers are felt well beyond the program itself. Although supercomputer use cases at exascale have been, and will continue to be, national lab based, the impact of exascale will undoubtedly change the face of both the wider High Performance Computing (HPC) sectors and furthermore, business analytics and machine learning.
From weather forecasting, medical research, cosmology, and quantum mechanics to machine learning and AI, exascale storage systems have an application. Simply put, any sector with massive amounts of data that needs to be analysed concurrently at extreme rates will benefit from exascale technology for years to come.
Exascale in the enterprise - is the compute letting down the storage?
Enterprise use cases for exascale-capable storage systems expose a lot of challenges across the board in algorithm design, network architecture, I/O paths, power consumption, reliability, and so on. One of the major areas of concern in the application of supercomputing, machine learning or analytics is the ability to perform a huge array of tasks simultaneously with minimal disturbance between tasks. Otherwise known as concurrency, this parallel execution is critical to success.
In contrast to previous major supercomputing milestones, exascale will not be reached by increasing CPU clock speeds, but rather through massive core counts enabled by many-core and GPU technologies. However, when you increase core count, the applications must increase in thread count to take advantage of the hardware and this in turn builds a concurrency-management problem which can be a real headache for enterprise datacentres and cloud providers, particularly when it comes to I/O and storage management.
Unlike the national labs, rather than managing one monolithic supercomputer, often running a single “grand challenge” application at a time, enterprise data centres are faced with general workloads that vary enormously with massive thread counts and highly varied request patterns all stressing the storage system at any one time. So, what you really need is a new storage architecture that can cope with this explosion in concurrency across the board.
Traditionally, HPC applications have required a lot of attention from algorithm developers to ensure that I/O patterns match the specific performance characteristics of storage systems. Long bursts of ordered I/O from a well-matched number of threads are handled well by storage systems, but small, random, misaligned I/O from very large numbers of threads can be catastrophic for performance. As we move to exascale, every component of the architecture must do its part to address issues like these, allowing application developers to focus on other areas of optimisation and scaling.
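The difference between the two access patterns is easy to demonstrate even on a single machine. The sketch below is a crude, local illustration only (nothing to do with parallel file systems or DDN’s software; the file name and sizes are arbitrary assumptions): it writes the same amount of data once as large sequential requests and once as small, out-of-order requests, and times both.

```python
# Crude single-node illustration of why request size and ordering matter: write
# the same 64 MiB once as large sequential requests and once as small random
# ones, and time both. File name and sizes are arbitrary; parallel file systems
# at exascale amplify this effect across thousands of clients.
import os
import random
import time

PATH = "io_demo.bin"
TOTAL = 64 * 1024 * 1024  # 64 MiB written in each pattern

def write_pattern(chunk_size, shuffle):
    payload = os.urandom(chunk_size)
    offsets = list(range(0, TOTAL, chunk_size))
    if shuffle:
        random.shuffle(offsets)          # small, out-of-order requests
    with open(PATH, "wb") as f:
        f.truncate(TOTAL)
        for off in offsets:
            f.seek(off)
            f.write(payload)
        f.flush()
        os.fsync(f.fileno())             # force the data out to the device

for label, chunk, shuffle in [("large sequential (4 MiB)", 4 * 1024 * 1024, False),
                              ("small random (4 KiB)", 4 * 1024, True)]:
    start = time.time()
    write_pattern(chunk, shuffle)
    print(f"{label}: {time.time() - start:.2f} s")

os.remove(PATH)
```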
Changing I/O in the exascale generation
Data-at-scale algorithms are also changing as the workloads they handle are transforming. The heightened use of AI across enterprise sectors – in machine learning for self-driving cars, real-time feature recognition and analytics – introduces very different I/O patterns from those we are used to seeing in the supercomputing world. Now, I/O is characterised not by ideal, large, sequential access, but rather by a complex mixture of large, small, random, unaligned, high-concurrency I/O in read-heavy workloads, which requires storage to provide streaming performance, high IOPS and high concurrency support.
The key to success in the exascale era will be storage systems that can handle the stress associated with this new generation of many-core operation and with the new spectrum of applications that display very diverse I/O behaviours.
The secrets behind an exascale storage architecture
HPC burst buffers certainly have their place in addressing this problem. Conceived to assist supercomputers in dealing with exascale issues of reliability and economically viable I/O, burst buffers were originally intended as an extreme-performance, flash-based area for compute nodes to write to.
At DDN, we started addressing the challenges of exascale systems around five years ago by developing a sophisticated layer of software that manages I/O in a very different way. We wanted to bridge the chasm between the application and new, solid state ultra-low latency storage devices to fundamentally address the sub-microsecond latencies which were emerging. And, unlike classic flash arrays, to do so at supercomputer (or cloud) scale. Furthermore, we wanted to support not just the limited supercomputer use cases, but instead create a system which could fundamentally do I/O better right across the board.
HPC burst buffers can make exascale I/O a reality today, and enable enterprises to run HPC jobs with much greater speed and efficiency by overcoming the performance limitations of spinning disk. By speeding up applications you can run more jobs faster and in parallel – all very well.
DDN’s Infinite Memory Engine (IME) goes quite a long way further. IME is a software defined storage service that introduces a new tier of transparent, extendable, non-volatile memory (NVM), with game changing latency reductions and greater bandwidth and IOPS performance for the next generation of performance-hungry scientific, AI, analytic and big data applications.
IME eliminates locking limitations and other filesystem bottlenecks while reducing storage hardware by 70%. When you have a very large dataset and a lot of compute, a system that looks performant on paper can easily become gummed up by the internal mechanics of a (parallel) file system performing huge numbers of filesystem operations and remote procedure calls (RPCs), owing to indivisible concurrency mechanisms and deterministic data placement. With IME, we replace these traditional data and control paths with new, flash-era paths that expose the IOPS of the underlying media directly to the applications – removing those bottlenecks.
The evolution continues…
The evolution in enterprise data-at-scale will continue to move forward at a significant pace. While most data-intensive organisations started off on NFS servers, then moved to scale-out NAS systems, and for tougher workloads used parallel file systems, these enterprises will now need to embrace the new generation of high performance storage architectures to handle the explosion of data-intensive applications and take advantage of flash. This can be achieved, and at massive scale, by taking advantage of the many lessons learned from building exascale storage systems and by deploying the new generation of data platforms built for the flash era.
To be a successful business in Europe in 2018, you will need to be able to demonstrate GDPR compliance. Numerous articles have already detailed what GDPR is, the ramifications of failing to comply with it, and many of the steps towards compliance. If there is one thing you know for certain, it is that there is a lot to take in, research to be done, and no easy route to fast-track compliance.
By Frank Krieger, VP of Governance, Risk and Compliance, iland.
So let’s take it one step at a time. Here I’ve outlined a chief aspect of GDPR, the aim being to help your business take another leap towards full comprehension of and compliance with GDPR. Specifically, my objective here is to analyse the GDPR requirement around having the unambiguous consent of a data subject.
GDPR will transform how an organisation controls data. Under GDPR, an organisation must obtain the explicit consent of a data subject in order to store, access and process any personal data. Protected data will fall into different categories. Personally identifying information such as names, birth dates, photos, email addresses, bank details, and even IP addresses will naturally fall into the general category of “protected data.” However, data which reveals the essence of someone’s personal life will be even more stringently controlled. This data, such as biometric records, religious or political views, and sexual orientation, will fall into special categories of protected data. Creating a classification scheme for data is a good first step towards GDPR compliance – for example, the classifications public, internal, confidential and regulatory. From there, you can identify risk, technical safeguards and access controls.
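A classification scheme like this can start life as something very simple. The sketch below is a minimal illustration assuming the four categories named above and a rule that confidential and regulatory fields are the ones needing explicit consent; the field names and mappings are invented, and the lawful basis for any real processing would need proper assessment.

```python
# Minimal sketch of the classification scheme described above. The categories,
# field assignments and consent rule are illustrative assumptions only; the
# lawful basis for real processing needs a proper assessment.
from enum import Enum

class Classification(Enum):
    PUBLIC = "public"
    INTERNAL = "internal"
    CONFIDENTIAL = "confidential"   # e.g. general personal data
    REGULATORY = "regulatory"       # e.g. special categories of data

FIELD_CLASSIFICATION = {
    "press_release":   Classification.PUBLIC,
    "org_chart":       Classification.INTERNAL,
    "email_address":   Classification.CONFIDENTIAL,
    "ip_address":      Classification.CONFIDENTIAL,
    "health_record":   Classification.REGULATORY,
    "religious_views": Classification.REGULATORY,
}

def needs_explicit_consent(field):
    """Assumed rule for this sketch: confidential and regulatory fields need explicit consent."""
    level = FIELD_CLASSIFICATION.get(field, Classification.INTERNAL)
    return level in (Classification.CONFIDENTIAL, Classification.REGULATORY)

if __name__ == "__main__":
    for field, level in FIELD_CLASSIFICATION.items():
        print(f"{field:<16} {level.value:<13} explicit consent: {needs_explicit_consent(field)}")
```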
How will the data subject give consent?
Data protection under GDPR will expressly impact marketing and sales operations. In the past, prospective or existing customers have only been given the option to “opt out” of marketing campaigns that target them. Now, with the introduction of GDPR, potential data subjects will always be required to “opt in” and voluntarily disclose their data before the data can be accessed and used. The collection and resale of data will be strictly controlled, with the priority being that the data subject’s consent is clearly present at each step of the process, especially when the data is changing hands. The data subject then has the right to review their data, and to ask for it to be entirely purged from the system at any time. The subject could even choose to monetise the use of their personal data. All in all, this gives an overview of the data subject’s rights under GDPR.
How will the organisation obtain consent?
GDPR will also introduce massive changes when it comes to the role of the business that is controlling or processing data. On the topic of consent, GDPR will drastically change how and why organisations are permitted to collect and process data. In the past, companies have been able to accumulate and resell data to suit their needs with little regulation. Now, organisations will need to present a business case and define the legal reason for data collection. It is important to limit the scope of the data being gathered.
If the organisation can present a valid case for processing the personal data of given subjects, it must then communicate this case to the data subject in plain terms. That is, no more legal jargon or 20-clause forms with tick-boxes at the bottom. GDPR will make it so that the data subject’s consent must be requested in clear, simple language.
The organisation must establish strict internal rules to dictate whose data it collects, whether it discloses the data and, if so, to which parties. The risks of disclosing data to third parties must be considered. The rules must also determine how long the data is retained and why, as well as what special purposes would warrant data removal.
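One way to keep these facts together is to hold them in a single consent record per subject and purpose. The following is a minimal sketch, with invented field names and a simple “not withdrawn and within retention” validity rule; a real implementation would also capture the wording shown to the subject and the lawful basis relied upon.

```python
# Illustrative consent record: who agreed, to what purpose, for how long, and
# whether consent has since been withdrawn. Field names are invented; a real
# record would also keep the wording shown to the subject and the lawful basis.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str                   # the plainly stated purpose consented to
    granted_at: datetime
    retention: timedelta           # how long the data may be kept, with the reason recorded
    shared_with: List[str] = field(default_factory=list)
    withdrawn_at: Optional[datetime] = None

    def is_usable(self, now: datetime) -> bool:
        """Consent is usable only if it has not been withdrawn and is within retention."""
        if self.withdrawn_at is not None and self.withdrawn_at <= now:
            return False
        return now <= self.granted_at + self.retention

if __name__ == "__main__":
    rec = ConsentRecord("subject-42", "monthly product newsletter",
                        granted_at=datetime(2018, 1, 10),
                        retention=timedelta(days=730),
                        shared_with=["mailing-provider"])
    print(rec.is_usable(datetime(2018, 6, 1)))   # True: consented and within retention
    rec.withdrawn_at = datetime(2018, 7, 1)      # the subject revokes consent
    print(rec.is_usable(datetime(2018, 8, 1)))   # False: consent withdrawn
```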
Managing data
When processing data on behalf of customers, you must align with security and regulatory processes. Under GDPR, you must also be completely transparent and keep the data subject informed of all processes. Data mapping can be used to monitor how data is flowing, and can also be referenced when establishing and tightening access controls: you can identify the systems and applications that are consuming data, and the individuals who have access to it. These maps will also help you create audit reports to validate adherence. The most important thing is to monitor your data processing carefully.
Additionally, you can create geographical view maps. Their purpose is to outline where data is housed and where protected data resides, and to help identify cross-border data flows. Network, application and system views are important for any organisation too, as they allow you to monitor the flow of data through systems and processes and to determine which departments and people are controlling and processing protected data.
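In its simplest form such a map is just a table of systems, their locations and the flows between them, which can then be queried for transfers that need safeguards. The sketch below is illustrative only: the systems, countries and flows are invented, and the EEA list is deliberately truncated.

```python
# Rough sketch of a geographical data-flow map: where each system holds data and
# which flows cross borders. Systems, countries and flows are invented, and the
# EEA list is deliberately truncated for the example.
SYSTEM_LOCATION = {
    "web_frontend": "DE",
    "crm":          "IE",
    "analytics":    "US",
    "backup_vault": "SG",
}

FLOWS = [  # (from_system, to_system, data_category)
    ("web_frontend", "crm",          "personal"),
    ("crm",          "analytics",    "personal"),
    ("crm",          "backup_vault", "personal"),
    ("web_frontend", "analytics",    "aggregated"),
]

EEA = {"DE", "IE", "FR", "NL"}  # simplified membership list for the sketch

def cross_border_personal_flows():
    """Flag personal-data flows that leave the EEA, for review in an audit."""
    return [(src, dst, SYSTEM_LOCATION[dst])
            for src, dst, category in FLOWS
            if category == "personal" and SYSTEM_LOCATION[dst] not in EEA]

if __name__ == "__main__":
    for src, dst, country in cross_border_personal_flows():
        print(f"personal data flows {src} -> {dst} ({country}): review transfer safeguards")
```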
As a business, you can secure the help of a data protection officer who is responsible for making sure that the data subject’s rights are upheld. Large organisations with the capacity to hire a full-time data protection officer are likely to have an easier time weathering GDPR. However, as a small or medium-sized business with a limited budget, it can be challenging to hire the services of a data protection officer to guide you through the process. Here at iland we can arm you with the necessary tools to ensure compliance in the cloud, from meeting GDPR regulations to enhancing security and more. And as we prepare ourselves for GDPR, so we are keen to share our findings and best practices with other organisations.
Business continuity incorporates pre-emptive measures such as cyber-defences to minimise risk, proactive tactics such as system backups in case a problem arises and plans for a reactive strategy, which should include disaster recovery (DR), ready in case the worst happens.
But in the wake of disaster, how do businesses continue with everyday operations? Paul Blore, Managing Director at Netmetix, explores the options available to organisations and how best to utilise them.
Business continuity
Traditional on-premise backup systems use removable media in the form of tapes or disk drives to store backup data. But this often means designated employees are required to manage and shuffle the backup media every day and preferably, take a copy offsite for safekeeping. The relatively high level of manual intervention can lead to errors being made, resulting in failed or incomplete backups. The removable media is typically a consumable and needs to be replaced at regular intervals, which can be costly, especially for larger capacity backups and media.
Beyond simple backups, conventional Disaster Recovery is a much more complex and costly proposition and typically requires a duplicate set of all the critical systems installed at a remote location, ready to step in if disaster strikes at the primary location. Many businesses have other concerns when it comes to backups and DR so it’s easy to see why organisations would question spending often serious budget on ‘what if’ technology that may never be needed. But what if disaster does strike?
Cloud based DR
Cloud technology has drastically reduced storage costs and has made backing up entire systems much more cost-effective and straightforward. All of the leading cloud providers – Microsoft, Amazon and Google – now offer backup as a core service of their cloud offerings, and clients can generally select whichever backup schedule and retention policy they wish to utilise.
Cloud computing also addresses the DR requirement. Major cloud service providers employ large-scale resilience and redundancy to ensure their systems remain operational. In the unlikely event an entire data centre goes down, client systems could operate from a second data centre. Most providers will also be able to back up on-premise systems and store that data in their cloud-based storage, with the same freedom to define schedule and retention. However, the very best systems can also provide a full DR service for on-premise systems by replicating on-premise data in almost real time into the cloud. Then, if disaster strikes, the systems can automatically allocate computing resources (CPUs, RAM, etc.) and “spin up” virtual servers to seamlessly take over until normal service is resumed on-site. Once the disaster has passed, the cloud systems will “fail back” to the on-premise systems and synchronise all data that was changed during the disaster window. This means that when it comes to defining a DR strategy, businesses now have far more options available, with genuine DR systems now a cost-effective possibility for SMEs.
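Stripped of any particular vendor’s tooling, the fail-over/fail-back cycle described above can be pictured as a small state machine. The sketch below is a provider-agnostic illustration with assumed names and thresholds; real services drive the same logic through their own replication and orchestration APIs.

```python
# Provider-agnostic sketch of the DR cycle described above: detect an outage,
# fail over to cloud replicas, then fail back and re-synchronise. States and
# thresholds are assumptions; real services drive this through their own APIs.
from enum import Enum, auto

class Site(Enum):
    ON_PREMISE = auto()
    CLOUD_REPLICA = auto()

class DrOrchestrator:
    def __init__(self, max_missed_heartbeats=3):
        self.active = Site.ON_PREMISE
        self.missed = 0
        self.max_missed = max_missed_heartbeats

    def heartbeat(self, on_premise_alive):
        """Count missed heartbeats; fail over at the threshold, fail back on recovery."""
        self.missed = 0 if on_premise_alive else self.missed + 1
        if self.active is Site.ON_PREMISE and self.missed >= self.max_missed:
            print("spinning up cloud replicas and redirecting users")
            self.active = Site.CLOUD_REPLICA
        elif self.active is Site.CLOUD_REPLICA and on_premise_alive:
            print("synchronising changes made during the outage, failing back on-premise")
            self.active = Site.ON_PREMISE

if __name__ == "__main__":
    dr = DrOrchestrator()
    for alive in [True, False, False, False, True]:   # simulated outage window
        dr.heartbeat(alive)
        print("active site:", dr.active.name)
```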
The SME
SMEs in particular are starting to discover the advantages of utilising cloud-based DR strategies. For businesses that may not have significant budget set aside specifically for IT resource, cloud-based solutions hold the key to successful adoption. Operating on usage-based costings, this type of system is ideal for cloud DR, as the secondary or replicated IT infrastructure lies in wait until it is required and businesses need only pay for it when, or if, they need it. Without the need for physical storage in data centres, smaller businesses are able to deploy their own Disaster Recovery strategy, making it no longer just for the larger enterprises.
So, what now?
Although business continuity should be a priority for businesses, in traditionally ‘offline’ industries organisations often see IT decisions as tactical rather than strategic. Businesses will cease to function at full capacity if a disaster strikes and the necessary business continuity procedures are not in place; as a direct result, they will experience a significant increase in downtime and expenditure.
If it isn’t already, business continuity must become a priority for organisations. It’s now easier than ever to migrate to the cloud and take advantage of the inbuilt backup and disaster recovery options available. With the rate of cyber attacks on businesses of all sizes increasing significantly, no company is immune from the threat of hacking, human error or natural disasters and there is no longer an excuse to not have these systems and procedures in place.
Organisations today are embracing technology at an ever-greater pace to better equip, enable and empower their business for the future. From the creation of new business streams to developing a competitive advantage, the CIO’s role is becoming increasingly pivotal in defining the business’s needs and implementing the technology to facilitate change. In an environment where technology is developing and maturing at an exponential rate, CIOs must always be looking ahead – so what does 2018 have in store?
By Neil Bramley, B2B Client Solutions Business Unit Director, Toshiba Northern Europe.
Undoubtedly, one of the key dates for next year is the deadline for GDPR compliance (25th May). Never has it been more important for European organisations to identify ways to best manage their data. Yet, faced with the risks of penalties and even legal action, worryingly many are underprepared with over 50 per cent of companies expected to not be in full compliance before the end of 2018.
In addition, organisations will continue to be faced with an exponential growth in cybercrime. Ransomware damages alone were expected to rise fifteen times from 2015, hitting $5bn globally this year. Simultaneously, across Europe, technology is fuelling a transition towards mobile and remote working, ushering in a new era of productivity and bringing with it new security challenges.
To address the increasingly complex regulatory environment, the growth in cybercrime, and adoption of mobile working practices, we expect to see a growth in the acceptance and education of some specific technology trends. These include: quantum cryptography, Edge Computing, and cloud-based virtual desktop infrastructures.
Quantum cryptography
The basic building blocks of computing are set to morph from maths to physics in the future with the introduction of quantum computing. Global Industry Analysts forecasts its global market to reach $2 billion by 2024, growth which is primarily driven by a constant need for the most secure online data transmission possible. From this, quantum cryptography is emerging as a highly evolved protection method, necessary to combat ever-increasing security threats. Its leading technique, quantum key distribution (QKD), produces keys readable only by the specific, intended recipient: the “keys” are distributed as photons (particles of light) which, if intercepted, immediately change state and render themselves unreadable.
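The principle can be illustrated classically. The toy simulation below is a sketch of the well-known BB84 scheme, not Toshiba’s implementation: measuring a photon in the wrong basis gives a random result, so the two parties keep only the positions where their bases happened to match, and an eavesdropper forced to guess bases would introduce detectable errors in that sifted key.

```python
# Toy BB84-style illustration of quantum key distribution (QKD). This is a
# classical simulation of the principle only, not Toshiba's system: a photon
# measured in the wrong basis gives a random result, so only positions where
# sender and receiver used the same basis are kept for the key, and an
# eavesdropper forced to guess bases would leave detectable errors.
import random

N = 32  # number of photons sent in this toy run

alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]   # '+' rectilinear, 'x' diagonal
bob_bases   = [random.choice("+x") for _ in range(N)]

def measure(bit, sent_basis, measured_basis):
    """Same basis: the bit is read correctly; mismatched basis: a random outcome."""
    return bit if sent_basis == measured_basis else random.randint(0, 1)

bob_results = [measure(b, ab, bb)
               for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases and keep only the positions where they match.
key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [r for r, ab, bb in zip(bob_results, alice_bases, bob_bases) if ab == bb]

print("sifted key length:", len(key_alice))
print("keys agree:", key_alice == key_bob)   # True here, since nothing intercepted the photons
```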
Recently, Toshiba made a breakthrough with quantum cryptography at its Cambridge Research Laboratory by creating the world’s fastest QKD device. Attaining a speed of 13.7Mbps – roughly seven times faster than Toshiba’s previous record speed of 1.9Mbps – this breakthrough brings the practical utilisation of quantum technology one step closer to the wider global community.
Edge Computing
With data proliferation coming from the rise of IoT and the predicted capabilities of 5G in 2018, Edge Computing will become ever more vital. For organisations that handle large amounts of data, deciding what to send to the cloud can reduce backlogs, leaving the cloud to perform the heavier tasks while Edge Computing enables increased mobility and real-time processing, increasing efficiency at both ends of an organisation’s IT chain. Wearables, such as smart glasses, will work in harmony with Edge Computing, helping to streamline processes within organisations in ever more remote or mobile environments. Take the NHS, for example: using a wide variety of end-point devices such as smart glasses to access locally stored data, healthcare providers can collect and analyse patient data at the edge in real time while interacting with patients, dramatically increasing their efficiency when consulting with patients, while more data can be sent to the cloud for further diagnosis.
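The edge/cloud split in that example boils down to a simple rule: summarise and act on routine readings locally, and forward only what needs deeper analysis. The sketch below illustrates that rule with invented fields and thresholds; a real deployment would sit behind device SDKs and secure transport.

```python
# Illustrative edge-filtering sketch: summarise readings locally and forward only
# those needing deeper, cloud-side analysis. Fields and thresholds are invented;
# a real deployment would sit behind device SDKs and secure transport.
from statistics import mean

def handle_at_edge(readings, alert_threshold=100.0):
    """Keep a local summary; escalate only anomalous readings to the cloud."""
    summary = {"count": len(readings), "average": round(mean(readings), 1)}
    escalate = [r for r in readings if r > alert_threshold]
    return summary, escalate

if __name__ == "__main__":
    heart_rates = [72, 75, 74, 132, 71, 70]          # simulated sensor samples
    summary, for_cloud = handle_at_edge(heart_rates)
    print("kept at the edge:", summary)
    print("sent to the cloud for further diagnosis:", for_cloud)
```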
Mobile zero clients
As organisations adopt more mobile and flexible working practices, security needs to be the number one priority to successfully embrace the benefits of mobility without falling victim to the increased threat of cyber-crime. Organisations have already seen the benefits of thin client solutions; however, because of cost and limitations that restrict remote working, more and more will move towards zero client solutions, which completely remove storage from devices, using external servers to drive operating systems with data access through a cloud-based virtual desktop infrastructure (VDI). By using zero client solutions, data is protected against malware and security issues should a device be lost or stolen.
Whether preparing for regulatory change, protecting against the ever-expanding cyber threat landscape, or addressing trends like mobile working, 2018 will be a year of digital transformation and learning for many organisations. While technology such as quantum cryptography is still evolving, it is already offering equal opportunities for cryptographers and hackers, so organisations need to start considering how it will impact them now.
The General Data Protection Regulation (GDPR) comes into effect on 25 May 2018, and while it may seem like there’s plenty of time left to achieve compliance, it’s important businesses begin making changes early. The new regulations contain critical obligations and whilst there is still time to meet the deadline, you must start creating a plan sooner rather than later, according to Paula Tighe, Information Governance Director at leading law firm Wright Hassall.
Regardless of the size of your business and the amount of data involved, the basic principles will be the same and should start with a comprehensive plan agreed between the people who will need to drive through the necessary changes.
You must remember that although the UK has voted to leave the EU, this has no bearing on the new law. If your data is obtained or processed within the EU, then you must still meet the new requirements.
Raise awareness and register it
The first step is to ensure that senior members of your business understand the new GDPR laws, and begin to push through the necessary changes as early as possible.
One of the most important ways of protecting your business is to record the compliance process, making a note of any significant changes to company policies.
Also known as the data register, this record will hold important information about the personal data your company currently has recorded, as well as details about how this data was obtained and why it is being processed.
Review and amend your processes
Rather than preventing you from doing things, compliance aims to improve standards by encouraging you to review existing procedures, and improve them where possible.
Start by reviewing your existing digital and hard copy format privacy notices and policies to ensure they are concise, written in clear language, easy to understand and easily found.
Finally, review how you communicate these notices and policies to data subjects, ensuring you explain your reason for processing the data, how long you will keep it and how individuals can complain to the Information Commissioner’s Office if they think you’re doing something wrong.
Rights of the individual
Post-GDPR, individuals will enjoy greater control over their personal data, which includes the right to have their information edited or even removed completely.
Therefore, it is crucial that your company introduces procedures that can deal with any such request quickly and efficiently – GDPR requires you to have effective processes in place, so you may even be asked to prove it.
Perhaps one of the key drivers for the changes is the right for an individual to prevent their data being used for direct marketing purposes, as is the right to challenge and prevent automated decision-making and profiling.
Having transparent procedures in place will go a long way towards heading off any future problems with the regulator, regardless of complaints or investigations. Remember, if your organisation handles personal data correctly under the current Data Protection Act, then the switch to the GDPR should pose no real problems.
Prepare for personal requests
If an individual makes a subject access request (to see what information you hold on them), you must be able to comply within a month, and you cannot charge for doing so. You can refuse to comply if you think the request has no merit – but you must tell the individual why, and that they have the right to complain to the regulator.
Key areas to remember are: have a procedure for identifying requests; assess whether a request is excessive to the point of being impossible to respond to; and take a transparent approach to acknowledging and disclosing the data in accordance with the GDPR.
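Even a very lightweight log of requests helps here. The sketch below is purely illustrative (it approximates “one month” as 30 days and invents the field and status wording) but shows the kind of deadline tracking an SME could put in place.

```python
# Lightweight sketch for logging subject access requests and their response
# deadline. "One month" is approximated as 30 days here, and the status wording
# is invented; it only illustrates the kind of tracking an SME could keep.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class SubjectAccessRequest:
    subject: str
    received: date
    excessive: bool = False   # if claimed, the reason must be recorded and explained

    @property
    def deadline(self):
        return self.received + timedelta(days=30)

    def next_action(self, today):
        if self.excessive:
            return "refuse, tell the individual why and of their right to complain"
        if today > self.deadline:
            return "OVERDUE: respond immediately and record the delay in the data register"
        return f"respond by {self.deadline.isoformat()}"

if __name__ == "__main__":
    sar = SubjectAccessRequest("a.customer@example.com", date(2018, 4, 3))
    print(sar.next_action(date(2018, 4, 20)))   # respond by 2018-05-03
```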
Again, in all reality, for SMEs it will be more important to show a willingness to comply by endeavouring to put in place all the necessary steps and recording the process in the data register, than it will be to be fully compliant on day one.
Never assume you have consent
This sounds simple, but might in fact be one of the trickier areas of the new regulations: consent for personal data to be captured and used for more than just contact.
Although an individual must give clear consent for their data to be used, they must be allowed to revoke their consent just as easily, at any time. And, if you change the way you want to use their data, sharing it with a new partner for instance, you must obtain a new consent.
Again, whilst consent can never be inferred and must be explicit, your attempt to obtain and confirm consent, even if you do not receive a reply, will help mitigate any future problems at the hands of the regulator.
Keep reviewing and keep recording
Under the GDPR, when you are obtaining and processing personal and sensitive categories of data, you need to record how this data will be retained and under what conditions; for example, whether the retention period is required for legal, regulatory and/or organisational purposes.
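Recorded in the data register, a retention schedule can be as simple as a table of data categories, periods and reasons. The sketch below is an invented example of such a schedule together with a check for data that has outlived its period; the categories, periods and reasons are assumptions, not legal advice.

```python
# Invented example of a retention schedule and a check for data past its period.
# Categories, periods and reasons are assumptions for the data register sketch,
# not legal advice.
from datetime import date, timedelta

RETENTION_SCHEDULE = {
    # category: (retention period, reason recorded in the data register)
    "payroll_records":   (timedelta(days=6 * 365), "legal obligation"),
    "marketing_consent": (timedelta(days=2 * 365), "organisational purpose"),
    "job_applications":  (timedelta(days=180),     "organisational purpose"),
}

def due_for_deletion(category, collected, today):
    period, _reason = RETENTION_SCHEDULE[category]
    return today > collected + period

if __name__ == "__main__":
    print(due_for_deletion("job_applications", date(2017, 9, 1), date(2018, 5, 25)))  # True
    print(due_for_deletion("payroll_records",  date(2017, 9, 1), date(2018, 5, 25)))  # False
```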
The new regulations bring a requirement for all businesses affected by the GDPR not only to have a retention (data minimisation) policy and schedule, but to carry out mandatory Privacy Impact Assessments (PIAs) if they want to process personal data as part of normal business practices, if it is to be processed on a new technological or information society system, or if it contains sensitive categories of data.
These assessments will help you decide the likely effects on the individual, mitigate any risk and build ‘privacy by design’ into how you obtain and process individuals’ data. Ensure you have a robust process for making the assessments and then record it, along with the outcome – a PIA is a simple step towards compliance, with the emphasis on what you do, rather than what you say you will do.
Make someone responsible and keep it up
If your company handles large quantities of data on a regular basis then it may be worth appointing a dedicated Data Protection Officer to oversee ongoing procedures, ensuring you are compliant at all times.
This does not necessarily have to be someone from within your organisation – many businesses prefer to employ someone on a part-time or consultancy basis to save on costs. It is also important to brief your employees on the new laws and teach them how to process sensitive data properly.
It’s not just electronically-held data that can pose a problem; you need to be aware of other data records including index cards held within your organisation – these too are covered by the regulations.
Finally, ensure you record the entire compliance process using your data register, as this can be the difference between incurring penalties for non-compliance and being given the benefit of the doubt.
Those organisations that can prove they have made an effort to meet the new requirements will fare a lot better than those who disregard the new changes altogether.
About the author: Experienced in working with small, medium and large private and public bodies, Paula advises on a range of data protection issues, including training design and delivery, marketing, housing, project management and ICT security.
About the firm: Wright Hassall, a full-service law firm, advises clients across a variety of sectors including advanced manufacturing and engineering; food and agriculture; housing, development and construction; and gaming and digital media.
Mark Hickman, Chief Operating Officer at WinMagic, looks at the barriers to putting more workloads in the cloud, citing one area that can knock them down.
Putting more in the cloud
IT departments are clearly seeing benefits from the cloud, or these kinds of predictions for the overall market would not exist. But take a deeper look at what is happening within companies, and the migration to cloud is not without its problems. In our survey, when we asked ITDMs about their top three concerns on placing future workloads in the cloud, 58% reported overall security as their top concern, followed by specifically protecting sensitive data from unauthorised access (55%) and the increased complexity of infrastructure (44%).
One simple example of the problems this creates is that on average, companies report they use three isolated vendor encryption solutions to protect data across cloud and on-premises infrastructure. It is no wonder that a third (33%) of respondents also reported that data is only partially encrypted in the cloud, and 39% admitted to not having unbroken security audit trails across virtual machines in the cloud, leaving them exposed to failed audits, data breaches, legal proceedings, fines and damage to their corporate reputation.
Compliance fog
New legislation, such as the EU General Data Protection Regulation, which comes into force in May 2018, will require companies to carefully manage the encryption, storage, use and sharing of personally identifiable information. Failure to comply can result in fines of up to 4% of annual turnover or €20m, whichever is the greater.
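As a rough illustration of the “whichever is the greater” rule described above, the maximum exposure can be worked out from annual turnover alone. The figures below are hypothetical examples, not real cases.

```python
# Minimal sketch of the GDPR maximum-fine rule: the greater of
# 4% of annual turnover or EUR 20m. Turnover figures are illustrative.
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Return the statutory cap: the greater of 4% of turnover or EUR 20m."""
    return max(0.04 * annual_turnover_eur, 20_000_000.0)

print(max_gdpr_fine(1_000_000_000))  # EUR 1bn turnover -> cap of 40000000.0
print(max_gdpr_fine(100_000_000))    # EUR 100m turnover -> cap of 20000000.0
```

In other words, the €20m floor means even relatively small organisations face a substantial maximum penalty, while for large enterprises the 4% figure quickly becomes the binding limit.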
Yet as companies prepare for the new legislative framework, there seems to be some confusion when it comes to that same data in the cloud. Only 39% of the participants in our survey consider themselves ultimately responsible for the compliance of data stored with cloud services. Worryingly, one fifth believe it is solely the responsibility of the cloud service provider, whilst a further fifth believe they are covered by their cloud service provider’s SLA.
It is critical for ITDMs to understand and accept that their company remains ultimately responsible for data wherever it resides. The service level agreements of cloud service providers do not cover data protection: they cover the protection of the cloud, rather than what is in it. Only by recognising the extent of these obligations can ITDMs fully address data protection and legislation such as the GDPR.
Good management can break down the barriers to adoption
All cloud workloads need to be effectively managed to ensure secure access, compliance, and protection against threats, both internal and external. But they cannot and should not be treated as a separate management challenge from on-premises servers and services. Management must cross these perceived boundaries; otherwise we leave a gap in which security and compliance fail. As we all know, this can lead to data breaches or regulatory failures that damage customer trust and corporate reputation, and can result in substantial fines or operational costs to rectify the situation.
Expanding infrastructure into the cloud has come at a cost for the majority of companies, with a greater burden on IT teams. Over half (55%) reported needing to use more management tools since migrating workloads to the cloud – sometimes needing multiple tools for the same task. Over half again (53%) reported spending more time on management tasks than ever before. Respondents were also asked how they would use any time saved on these management tasks.
So, as we can see, we have a situation where some of the benefits of the cloud are being realised, but at the expense of internal IT staff resources. In the same way that management tools have played a key role in allowing IT departments to better visualise and manage the deployment of virtual machines over the last 10 years, they can play the same role for cloud services. But the inescapable fundamental with cloud services is that their lack of physical visibility within the IT infrastructure makes them prone to human error in management and administration. Investing in cloud services and security tools is not enough; tools that can manage all these areas together are essential.
Conclusion
At its heart, using heterogeneous cloud environments is making it harder for businesses to manage security and compliance, leaving staff firefighting rather than focusing on other areas. Companies need to choose management tools that are cloud agnostic and that remove complexity, so they can focus on exploiting the benefits of cloud computing, such as investing in-house resources in projects that will advance the business. Financially, it is easy to make the case for the cloud accounting for a larger part of your IT infrastructure, but without the right management tools and processes in place, the issues discussed here can combine into a perfect storm that leaves the company exposed to compliance and security risks.